April, 2022
import pandas as pd
import numpy as np
import seaborn as sns
# To plot pretty figures
%matplotlib inline
import matplotlib as mpl
import matplotlib.pyplot as plt
mpl.rc('axes', labelsize=14)
mpl.rc('xtick', labelsize=12)
mpl.rc('ytick', labelsize=12)
import sklearn
import tensorflow as tf
from tensorflow import keras
import plotly.express as px
import warnings
from sklearn.impute import KNNImputer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.linear_model import Ridge
from sklearn.linear_model import RidgeCV
from sklearn.linear_model import Lasso
from sklearn.linear_model import LassoCV
from sklearn.linear_model import ElasticNet
from sklearn.linear_model import ElasticNetCV
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RepeatedKFold
warnings.filterwarnings("ignore")
The challenge is based on a regression problem that involves predicting energy efficiency. More specifically, I will perform an analysis using various building shapes, each with its own set of attributes, and predict the building's heating load. The buildings differ in terms of glazing area, glazing area distribution, orientation, and other characteristics included in the dataset.
The dataset includes 9 attributes denoted by X0, X1,..., X8, as well as an outcome variable Y that must be predicted. The following are the definitions of the ten variables:
data = pd.read_csv("datcw_na.csv")
data.head()
| | X0 | X1 | X2 | X3 | X4 | X5 | X6 | X7 | X8 | Y |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | C3 | 1.19 | 622.55 | NaN | 89.31 | 7.00 | 1.98 | 0.0 | 0.0 | 15.55 |
| 1 | C1 | 1.19 | 622.55 | 323.40 | 109.15 | 7.70 | 3.00 | 0.0 | 0.0 | 15.55 |
| 2 | C1 | 0.88 | 463.05 | 291.06 | 99.23 | 5.67 | 4.40 | 0.0 | 0.0 | 15.55 |
| 3 | C2 | 0.79 | 509.36 | 291.06 | 121.28 | 6.30 | 4.05 | 0.0 | 0.0 | 15.55 |
| 4 | C1 | 0.89 | 507.15 | 385.39 | 121.28 | 7.70 | 2.00 | 0.0 | 0.0 | 20.84 |
data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 768 entries, 0 to 767
Data columns (total 10 columns):
 #   Column  Non-Null Count  Dtype
---  ------  --------------  -----
 0   X0      768 non-null    object
 1   X1      768 non-null    float64
 2   X2      768 non-null    float64
 3   X3      728 non-null    float64
 4   X4      768 non-null    float64
 5   X5      768 non-null    float64
 6   X6      768 non-null    float64
 7   X7      768 non-null    float64
 8   X8      768 non-null    float64
 9   Y       768 non-null    float64
dtypes: float64(9), object(1)
memory usage: 60.1+ KB
data.describe()
| | X1 | X2 | X3 | X4 | X5 | X6 | X7 | X8 | Y |
|---|---|---|---|---|---|---|---|---|---|
| count | 768.000000 | 768.000000 | 728.000000 | 768.000000 | 768.000000 | 768.000000 | 768.000000 | 768.000000 | 768.000000 |
| mean | 0.763516 | 666.768997 | 321.102527 | 176.564141 | 5.229766 | 3.527331 | 0.237852 | 2.803737 | 22.307201 |
| std | 0.147093 | 120.863329 | 60.479340 | 51.280618 | 1.844813 | 1.245710 | 0.139736 | 1.597817 | 10.090196 |
| min | 0.500000 | 416.740000 | 198.450000 | 89.310000 | 2.840000 | 1.620000 | 0.000000 | 0.000000 | 6.010000 |
| 25% | 0.650000 | 575.510000 | 277.830000 | 132.300000 | 3.470000 | 2.427500 | 0.100000 | 1.517500 | 12.992500 |
| 50% | 0.750000 | 661.500000 | 315.320000 | 178.235000 | 4.955000 | 3.600000 | 0.240000 | 2.970000 | 18.950000 |
| 75% | 0.860000 | 741.130000 | 355.740000 | 218.300000 | 6.930000 | 4.425000 | 0.360000 | 3.960000 | 31.667500 |
| max | 1.190000 | 978.290000 | 503.970000 | 266.800000 | 8.470000 | 6.050000 | 0.480000 | 6.050000 | 43.100000 |
Generally, there are no clear patterns we can infer from the functions above; we use them to get a general sense of the data. Our dataset has 768 entries/rows and 10 columns and, except for variable X0, all variables are floating-point numbers. However, one of the columns, the variable X3, contains only 728 non-null values, indicating that missing data exists. In addition, we can see a discrepancy between the median and the mean in several columns, which suggests some skewness: the data is not distributed perfectly symmetrically, as the mean and median would otherwise be equal. The standard deviation, which represents the dispersion of a set of data values from their mean, is another interesting metric to consider here. When a variable's standard deviation is low, its data points are close to the mean, and vice versa. When there is missing data, as in the case of variable X3, this value can help us decide how to manage the problem.
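As a quick sanity check on the mean/median reading, pandas can compute sample skewness directly; a minimal sketch on a synthetic right-skewed sample (not the dataset above):

```python
import numpy as np
import pandas as pd

# Synthetic right-skewed sample: the long right tail pulls the mean above the median
rng = np.random.default_rng(42)
s = pd.Series(rng.exponential(scale=10.0, size=1000))

print(s.mean() > s.median())  # True for a right-skewed sample
print(s.skew() > 0)           # positive sample skewness
```

The same `.skew()` call on each numeric column of the real dataset would quantify the asymmetry hinted at by describe().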
# let's visually inspect the data with histograms showing bars of frequencies of numeric values grouped in bins.
data.hist(bins=30, figsize=(20,15))
plt.show()
for i in data.columns:
    if data[i].dtype != 'object':
        sns.boxplot(data[i])
        plt.title(i)
        plt.show()
len(data.loc[data['X1'] > 1.1])
13
data.loc[data['X1'] > 1.1].head()
| | X0 | X1 | X2 | X3 | X4 | X5 | X6 | X7 | X8 | Y |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | C3 | 1.19 | 622.55 | NaN | 89.31 | 7.00 | 1.98 | 0.00 | 0.00 | 15.55 |
| 1 | C1 | 1.19 | 622.55 | 323.40 | 109.15 | 7.70 | 3.00 | 0.00 | 0.00 | 15.55 |
| 96 | C3 | 1.19 | 514.50 | 238.14 | 121.28 | 6.93 | 1.80 | 0.09 | 1.98 | 24.29 |
| 194 | C3 | 1.19 | 416.74 | 323.40 | 133.41 | 6.93 | 4.84 | 0.08 | 4.40 | 24.04 |
| 291 | C1 | 1.19 | 416.74 | 291.06 | 99.23 | 6.93 | 6.05 | 0.24 | 0.99 | 28.41 |
len(data.loc[data['X3'] > 450])
20
data.loc[data['X3'] > 450].head()
| | X0 | X1 | X2 | X3 | X4 | X5 | X6 | X7 | X8 | Y |
|---|---|---|---|---|---|---|---|---|---|---|
| 22 | C1 | 0.75 | 661.50 | 458.15 | 148.23 | 6.30 | 3.96 | 0.00 | 0.00 | 24.77 |
| 23 | C3 | 0.92 | 654.88 | 503.97 | 122.50 | 7.70 | 6.05 | 0.00 | 0.00 | 23.93 |
| 165 | C2 | 0.92 | 727.65 | 503.97 | 148.23 | 6.30 | 2.43 | 0.11 | 2.97 | 33.28 |
| 212 | C1 | 0.84 | 654.89 | 503.97 | 110.25 | 6.93 | 2.42 | 0.11 | 4.84 | 33.08 |
| 214 | C2 | 0.75 | 654.89 | 458.15 | 121.28 | 8.47 | 3.96 | 0.09 | 3.24 | 33.09 |
We can see that the variables X1 and X3 have a few outliers. However, because these outliers are not dramatically distant from the other points, they do not seem to be influential points ("extreme outliers") that would significantly affect the slope of the regression line, so I do not believe it is necessary to exclude them from the data. They appear to be relevant to our analysis and results. Additionally, it should be noted that, as seen before, none of our numerical variables follows a normal distribution; the variables X4 and X7 are the closest to that scenario.
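One common way to separate mild from extreme outliers is Tukey's IQR fence rule (1.5×IQR for mild, 3×IQR for extreme); a minimal sketch on hypothetical values (`tukey_outliers` is an illustrative helper, not part of the analysis above):

```python
import pandas as pd

def tukey_outliers(s: pd.Series, k: float = 1.5) -> pd.Series:
    """Boolean mask of points outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = s.quantile(0.25), s.quantile(0.75)
    iqr = q3 - q1
    return (s < q1 - k * iqr) | (s > q3 + k * iqr)

s = pd.Series([0.5, 0.65, 0.75, 0.86, 0.9, 1.19, 5.0])  # 5.0 is far from the rest
mild = tukey_outliers(s, k=1.5)     # mild outliers (inner fence)
extreme = tukey_outliers(s, k=3.0)  # extreme outliers (outer fence)
print(mild.sum(), extreme.sum())    # the value 5.0 is flagged by both fences
```

Applying the mask with `k=3.0` to X1 and X3 would confirm whether any of their flagged points cross the "extreme" fence.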
The corr() function uses the Pearson correlation by default; its coefficients indicate how strong the linear relationship between two variables x and y is. A linear relationship describes a straight-line relationship between two variables: under this assumption the two variables have a direct connection, meaning that if the value of x changes, y must also change. For example, if the volume of a material is doubled, the weight of the material also doubles; this is a perfect linear relationship.
The correlation coefficient lies in the range -1.0 to 1.0. A positive association is indicated by a coefficient greater than zero, which means that if the value of one variable rises, the value of the other tends to increase as well; on a scatterplot, positive relationships produce an upward slope. A negative correlation is shown by a coefficient smaller than zero, which means that as the value of one variable rises, the value of the other tends to fall; negative relationships produce a downward slope. In other words, a positive correlation indicates that both variables move in the same direction, whereas a negative correlation indicates that they move in opposite directions. Finally, a value of zero shows that the two variables x and y have no linear relationship: changes in the output are not proportional to changes in the input and, in this case, when one variable increases, there is no tendency in the other to either increase or decrease.
When the value is between 0 and +1/-1 there is a relationship, but the points do not all fall on a line. The strength of the association grows as r approaches -1 or 1, and the data points fall closer to a straight line.
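The coefficient just described can be computed straight from its definition, r = cov(x, y) / (σx·σy), and checked against NumPy; `pearson_r` below is an illustrative helper:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation: covariance of x and y normalised by their standard deviations."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum())

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.0, 1.0, 4.0, 1.0, 5.0])

print(pearson_r(x, 2 * x + 1))  # perfect positive linear relation -> 1.0
print(pearson_r(x, -x))         # perfect negative linear relation -> -1.0
assert np.isclose(pearson_r(x, y), np.corrcoef(x, y)[0, 1])  # matches NumPy
```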
mask = np.triu(np.ones_like(data.corr(), dtype=bool))
plt.figure(figsize = (20, 10))
sns.heatmap(data.corr(),mask =mask, annot = True, cmap="YlGnBu")
plt.show()
corr_matrix = data.corr()
corr_matrix["Y"].sort_values(ascending=False)
Y     1.000000
X5    0.815769
X1    0.454177
X3    0.312449
X7    0.255901
X8    0.087106
X6    0.001340
X2   -0.481192
X4   -0.771040
Name: Y, dtype: float64
The predictors that appear to have a statistically significant relationship to the response are X5, X1, X3, X7, X2 and X4. However, some variables present stronger correlations with the Y variable than others, with correlation coefficients closer to 1 or -1. We also observe that some of our predictors have substantial associations with one another, as seen, for example, in the positive correlation values of 0.57 for variables X1 and X5 and for variables X4 and X2. Consequently, an effect in one of the variables is absorbed by the other.
The scatterplots below illustrate: the positive or negative linear relationship between the variable Y and the predictors that correlate strongly with it; the regression coefficients, where the further a coefficient is from zero, the greater the impact of the predictor on the variable Y; the distributions and density of the variables, through the box and violin plots; and, finally, the R-squared, which represents the proportion of the variance of the dependent variable that is explained by the independent variable. The higher the R-squared, the better: it provides a measure of the strength of the relationship between the model and the response variable.
# Relative Compactness vs Heating Load
h = None
px.scatter(data, x=data['X1'], y=data['Y'],
color=h, marginal_y="violin", marginal_x="box",
trendline="ols", template="simple_white")
# Surface Area vs Heating Load
px.scatter(data, x=data['X2'], y=data['Y'],
color=h, marginal_y="violin", marginal_x="box",
trendline="ols", template="simple_white")
# Wall Area vs Heating Load
px.scatter(data, x=data['X3'], y=data['Y'],
color=h, marginal_y="violin", marginal_x="box",
trendline="ols", template="simple_white")
# Roof Area vs Heating Load
px.scatter(data, x=data['X4'], y=data['Y'],
color=h, marginal_y="violin", marginal_x="box",
trendline="ols", template="simple_white")
# Overall Height vs Heating Load
px.scatter(data, x=data['X5'], y=data['Y'],
color=h, marginal_y="violin", marginal_x="box",
trendline="ols", template="simple_white")
#Glazing Area vs Heating Load
px.scatter(data, x=data['X7'], y=data['Y'],
color=h, marginal_y="violin", marginal_x="box",
trendline="ols", template="simple_white")
#Dropping features which have close to 0 correlations with the outcome
data = data.drop(['X6', 'X8'], axis=1)
data.head()
| | X0 | X1 | X2 | X3 | X4 | X5 | X7 | Y |
|---|---|---|---|---|---|---|---|---|
| 0 | C3 | 1.19 | 622.55 | NaN | 89.31 | 7.00 | 0.0 | 15.55 |
| 1 | C1 | 1.19 | 622.55 | 323.40 | 109.15 | 7.70 | 0.0 | 15.55 |
| 2 | C1 | 0.88 | 463.05 | 291.06 | 99.23 | 5.67 | 0.0 | 15.55 |
| 3 | C2 | 0.79 | 509.36 | 291.06 | 121.28 | 6.30 | 0.0 | 15.55 |
| 4 | C1 | 0.89 | 507.15 | 385.39 | 121.28 | 7.70 | 0.0 | 20.84 |
Treatment of categorical variables is one of the data preprocessing phases, and it is a crucial step because most machine learning algorithms cannot handle categorical variables unless they are converted to numerical values; even those that can often perform better when all variables are numeric and preferably of the same dtype. Ordinal Encoding and One-Hot Encoding are two of the most used methods for dealing with categorical variables. In this example, we will use the One-Hot Encoding approach, which can be done in two ways: one, by using get_dummies in pandas, and two, by using OneHotEncoder from sklearn. Ordinal Encoding maps each unique category value to a specific numerical value depending on its order or rank, converting ordinal categories into ordered numerical values. In our case, we do not want to give our categorical variable any order, and we do not want our model to account for one, as that would result in poor performance.
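To see why the choice matters, here is a minimal sketch contrasting the two encodings on a toy column (hypothetical values, not from the dataset):

```python
import pandas as pd
from sklearn.preprocessing import OrdinalEncoder

toy = pd.DataFrame({"X0": ["C3", "C1", "C2", "C1"]})

# Ordinal encoding imposes an artificial order C1 < C2 < C3
ordinal = OrdinalEncoder().fit_transform(toy[["X0"]])
print(ordinal.ravel())  # [2. 0. 1. 0.]

# One-hot encoding creates one 0/1 indicator column per category, with no order
onehot = pd.get_dummies(toy, columns=["X0"])
print(list(onehot.columns))  # ['X0_C1', 'X0_C2', 'X0_C3']
print(onehot.sum(axis=1).tolist())  # each row has exactly one indicator set
```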
data["X0"].unique()
array(['C3', 'C1', 'C2'], dtype=object)
data["X0"].value_counts()
C2    265
C3    260
C1    243
Name: X0, dtype: int64
data = pd.get_dummies(data=data,columns=['X0'])
data.head()
| | X1 | X2 | X3 | X4 | X5 | X7 | Y | X0_C1 | X0_C2 | X0_C3 |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1.19 | 622.55 | NaN | 89.31 | 7.00 | 0.0 | 15.55 | 0 | 0 | 1 |
| 1 | 1.19 | 622.55 | 323.40 | 109.15 | 7.70 | 0.0 | 15.55 | 1 | 0 | 0 |
| 2 | 0.88 | 463.05 | 291.06 | 99.23 | 5.67 | 0.0 | 15.55 | 1 | 0 | 0 |
| 3 | 0.79 | 509.36 | 291.06 | 121.28 | 6.30 | 0.0 | 15.55 | 0 | 1 | 0 |
| 4 | 0.89 | 507.15 | 385.39 | 121.28 | 7.70 | 0.0 | 20.84 | 1 | 0 | 0 |
data.columns
Index(['X1', 'X2', 'X3', 'X4', 'X5', 'X7', 'Y', 'X0_C1', 'X0_C2', 'X0_C3'], dtype='object')
#Reorder dataframe
data = data[['X0_C1', 'X0_C2','X0_C3', 'X1', 'X2', 'X3', 'X4', 'X5','X7', 'Y']]
data.head()
| | X0_C1 | X0_C2 | X0_C3 | X1 | X2 | X3 | X4 | X5 | X7 | Y |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 1 | 1.19 | 622.55 | NaN | 89.31 | 7.00 | 0.0 | 15.55 |
| 1 | 1 | 0 | 0 | 1.19 | 622.55 | 323.40 | 109.15 | 7.70 | 0.0 | 15.55 |
| 2 | 1 | 0 | 0 | 0.88 | 463.05 | 291.06 | 99.23 | 5.67 | 0.0 | 15.55 |
| 3 | 0 | 1 | 0 | 0.79 | 509.36 | 291.06 | 121.28 | 6.30 | 0.0 | 15.55 |
| 4 | 1 | 0 | 0 | 0.89 | 507.15 | 385.39 | 121.28 | 7.70 | 0.0 | 20.84 |
When dealing with missing data, we have 3 options:

1. Impute the missing values, e.g. with the column's median.
2. Delete the rows that contain missing values.
3. Remove the variable/column from the analysis.

When the fraction of missing data is small, the imputation method can be quite useful. If the portion of missing data is too high, imputation will produce data that lacks natural variation. On the other hand, it is important that the value we use to fill in the missing entries, such as the column's median, is sensible and appropriate for the variable we are working with. It is therefore important to try to figure out why the data is missing and to fully understand the variable we are dealing with. If the percentage of missing data is small, the choice to delete rows with missing data also makes sense; otherwise, we may wind up with an insufficient number of observations to produce a reliable analysis. Finally, removing the variable/column from our analysis appears sustainable only if we are faced with a high percentage of non-existent values, such as 60%, because the observations we do have may be questionable and not representative of the population, and choosing either of the other options would lead to one of the less favourable scenarios mentioned above.
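The three options can be sketched on a toy frame (hypothetical values):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1.0, np.nan, 3.0, 4.0], "b": [10.0, 20.0, 30.0, 40.0]})

# Option 1: impute, e.g. with the column's median
imputed = df.fillna({"a": df["a"].median()})

# Option 2: delete rows containing missing values
dropped_rows = df.dropna()

# Option 3: drop the affected column entirely
dropped_col = df.drop(columns=["a"])

print(imputed["a"].tolist(), len(dropped_rows), list(dropped_col.columns))
```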
data.isnull().sum().X3
40
data.describe().X3
count    728.000000
mean     321.102527
std       60.479340
min      198.450000
25%      277.830000
50%      315.320000
75%      355.740000
max      503.970000
Name: X3, dtype: float64
data[data['X3'].isnull()].head()
| | X0_C1 | X0_C2 | X0_C3 | X1 | X2 | X3 | X4 | X5 | X7 | Y |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 1 | 1.19 | 622.55 | NaN | 89.31 | 7.00 | 0.00 | 15.55 |
| 34 | 1 | 0 | 0 | 0.68 | 661.50 | NaN | 178.60 | 3.47 | 0.00 | 6.77 |
| 88 | 0 | 1 | 0 | 0.64 | 776.16 | NaN | 242.55 | 3.85 | 0.10 | 15.41 |
| 121 | 0 | 0 | 1 | 0.73 | 617.40 | NaN | 218.30 | 3.47 | 0.09 | 10.46 |
| 168 | 1 | 0 | 0 | 0.67 | 754.60 | NaN | 242.55 | 3.47 | 0.11 | 10.39 |
Using a model to forecast missing values is an effective approach to data imputation. For each feature with missing values, a model is built using the values of (potentially) all other input features. In our case, I'll use Nearest Neighbor Imputation with KNNImputer, in which each sample's missing values are imputed using the mean value from the n_neighbors nearest neighbours found in the training set.
# define imputer
imputer = KNNImputer(n_neighbors=5, weights='uniform', metric='nan_euclidean')
# fit on the dataset
imputer.fit(data)
# transform the dataset
fill_nan = imputer.transform(data)
# transform into a dataframe
fill_nan = pd.DataFrame(fill_nan)
fill_nan.head()
| | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0.0 | 0.0 | 1.0 | 1.19 | 622.55 | 292.088 | 89.31 | 7.00 | 0.0 | 15.55 |
| 1 | 1.0 | 0.0 | 0.0 | 1.19 | 622.55 | 323.400 | 109.15 | 7.70 | 0.0 | 15.55 |
| 2 | 1.0 | 0.0 | 0.0 | 0.88 | 463.05 | 291.060 | 99.23 | 5.67 | 0.0 | 15.55 |
| 3 | 0.0 | 1.0 | 0.0 | 0.79 | 509.36 | 291.060 | 121.28 | 6.30 | 0.0 | 15.55 |
| 4 | 1.0 | 0.0 | 0.0 | 0.89 | 507.15 | 385.390 | 121.28 | 7.70 | 0.0 | 20.84 |
data.head()
| | X0_C1 | X0_C2 | X0_C3 | X1 | X2 | X3 | X4 | X5 | X7 | Y |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 1 | 1.19 | 622.55 | NaN | 89.31 | 7.00 | 0.0 | 15.55 |
| 1 | 1 | 0 | 0 | 1.19 | 622.55 | 323.40 | 109.15 | 7.70 | 0.0 | 15.55 |
| 2 | 1 | 0 | 0 | 0.88 | 463.05 | 291.06 | 99.23 | 5.67 | 0.0 | 15.55 |
| 3 | 0 | 1 | 0 | 0.79 | 509.36 | 291.06 | 121.28 | 6.30 | 0.0 | 15.55 |
| 4 | 1 | 0 | 0 | 0.89 | 507.15 | 385.39 | 121.28 | 7.70 | 0.0 | 20.84 |
#Fill missing values in X3 original column
data['X3'] = fill_nan[5]
data.isnull().sum()
X0_C1    0
X0_C2    0
X0_C3    0
X1       0
X2       0
X3       0
X4       0
X5       0
X7       0
Y        0
dtype: int64
In order to start training models to predict heating load, we need to split the data into an X array that contains the features to train on and a y array with the target variable. This division will be based on the stratified sampling approach, a method of obtaining a representative sample from a population that has been separated into approximately comparable subpopulations. Researchers use stratified sampling to ensure that specified subgroups are represented in their sample; it thus helps retain the full diversity of the population in the sample. When members of subpopulations are homogeneous relative to the total population, stratified sampling can give more accurate estimates than simple random sampling, which increases the statistical power of a study. For our division, we will stratify on the X5 variable because it has the strongest correlation with the Y variable.
data['X5'].hist()
<AxesSubplot:>
data['X5_bin'] = pd.cut(data['X5'],
bins=[2., 3.5, 5.0, 6.5, 8., np.inf],
labels=[1, 2, 3, 4, 5])
data['X5_bin'].value_counts()
1    272
4    216
3    122
2    112
5     46
Name: X5_bin, dtype: int64
data['X5_bin'].hist()
<AxesSubplot:>
# now we do stratified sampling on X5_bin
from sklearn.model_selection import StratifiedShuffleSplit
split = StratifiedShuffleSplit(n_splits=1, test_size=0.3, random_state=42)
for train_index, test_index in split.split(data, data['X5_bin']):
    strat_train_set = data.loc[train_index]
    strat_test_set = data.loc[test_index]
def X5_proportions(data):
    return data['X5_bin'].value_counts() / len(data)
train_set, test_set = train_test_split(data, test_size=0.3, random_state=42)
compare_props = pd.DataFrame({
"Overall": X5_proportions(data),
"Stratified": X5_proportions(strat_test_set),
"Random": X5_proportions(test_set),
}).sort_index()
compare_props["Rand. %error"] = 100 * compare_props["Random"] / compare_props["Overall"] - 100
compare_props["Strat. %error"] = 100 * compare_props["Stratified"] / compare_props["Overall"] - 100
compare_props
| | Overall | Stratified | Random | Rand. %error | Strat. %error |
|---|---|---|---|---|---|
| 1 | 0.354167 | 0.354978 | 0.324675 | -8.326967 | 0.229183 |
| 2 | 0.145833 | 0.142857 | 0.142857 | -2.040816 | -2.040816 |
| 3 | 0.158854 | 0.160173 | 0.160173 | 0.830317 | 0.830317 |
| 4 | 0.281250 | 0.281385 | 0.324675 | 15.440115 | 0.048100 |
| 5 | 0.059896 | 0.060606 | 0.047619 | -20.496894 | 1.185771 |
We can observe that the distribution of the variable X5 in the stratified test set is much closer to that of the original dataset than the random test set's distribution.
# after stratified sampling, we drop this variable from the train and test datasets
for set_ in (strat_train_set, strat_test_set):
    set_.drop("X5_bin", axis=1, inplace=True)
X_train = strat_train_set.iloc[:, :9].reset_index(drop=True) #Get all rows and all columns except the last one
y_train = strat_train_set['Y'].reset_index(drop=True)
X_test = strat_test_set.iloc[:, :9].reset_index(drop=True)
y_test = strat_test_set['Y'].reset_index(drop=True)
print(X_train.shape)
print(y_train.shape)
print(X_test.shape)
print(y_test.shape)
(537, 9)
(537,)
(231, 9)
(231,)
It would be difficult to feed the model values that have widely varying ranges. The model might adapt to such heterogeneous data automatically, but it would make learning more challenging. As a result, feature normalisation (Min-Max scaling) or standardisation (Z-score normalisation) is required. The first rescales the data from its original range so that all values fall within the new range of 0 to 1. This scaling is strongly affected by outliers, and it can be useful in algorithms such as K-Nearest Neighbors and neural networks that do not assume any distribution of the data. The second, standardisation, rescales the distribution of values so that the mean of the observed values is 0 and the standard deviation is 1. Because there is no predefined range for the transformed features, this scaling is less affected by outliers, and it is more effective when the data follows a normal distribution. That last part does not necessarily have to be true, but standardisation is generally advantageous when variables have different scales and the technique you are using, such as linear regression, logistic regression, or linear discriminant analysis, makes assumptions about the data having a Gaussian distribution. Taking everything into account, and considering the models I will be using to train our data, I will use standardisation. However, I believe that normalisation would also do a good job.
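A minimal sketch of the two rescalings on a toy column, using nothing beyond the two sklearn scalers:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

x = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])

# Min-Max scaling: values rescaled into [0, 1]
mm = MinMaxScaler().fit_transform(x)
print(mm.ravel())  # 0, 0.25, 0.5, 0.75, 1

# Standardisation: zero mean, unit standard deviation
st = StandardScaler().fit_transform(x)
print(np.isclose(st.mean(), 0.0), np.isclose(st.std(), 1.0))  # True True
```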
# numerical features
num_cols = ['X1', 'X2', 'X3', 'X4', 'X5','X7']
# apply standardization on numerical features
for i in num_cols:
    # fit on the training data column; [[i]] because X_train[i] returns a 1D Series and fit expects a 2D array
    scale = StandardScaler().fit(X_train[[i]])
    # transform the training data column
    X_train[i] = scale.transform(X_train[[i]])
    # transform the testing data column
    X_test[i] = scale.transform(X_test[[i]])
Standardisation was only applied to the numerical columns, not to the One-Hot encoded features. Standardising the One-Hot encoded features would mean assigning a distribution to categorical features, and we do not want to do that. The One-Hot encoded features are already in the range 0 to 1, so normalisation would have no effect on their values if I chose to normalise the data. It is also worth mentioning that the quantities used to standardise the test data are computed from the training data. Even for something as simple as normalisation or standardisation, we should never use any quantity computed on the test data.
I'm going to use the Random Forest importance technique for feature selection. Random forests are a type of bagging algorithm that aggregates a set of decision trees. Random forests' tree-based strategies naturally rank features by how much they increase node purity, or, in other words, how much they reduce impurity across all trees. The nodes with the largest drop in impurity occur at the beginning of the trees, while the nodes with the smallest decrease in impurity occur at the end. We can produce a subset of the most essential features by pruning trees below a specific node.
from sklearn.ensemble import RandomForestRegressor
forest_reg = RandomForestRegressor(n_estimators=100, random_state=42)
forest_reg.fit(X_train, y_train)
# evaluating the random forest model on the training set
train_predictions = forest_reg.predict(X_train)
forest_mse = mean_squared_error(y_train, train_predictions)
forest_rmse = np.sqrt(forest_mse)
forest_rmse
1.030877923584229
# first define a function to display (error) scores in a CV, on all test folds, and their average and standard deviation
def display_scores(scores):
    print("Scores:", scores)
    print("Mean:", scores.mean())
    print("Standard deviation:", scores.std())
# evaluating the random forest within CV; it shows a bigger RMSE than on the training set, so we can say the model overfitted the training set
from sklearn.model_selection import cross_val_score
forest_scores = cross_val_score(forest_reg, X_train, y_train,
n_jobs=-1, scoring="neg_mean_squared_error", cv=5)
forest_rmse_scores = np.sqrt(-forest_scores)
display_scores(forest_rmse_scores)
Scores: [2.62793967 2.88026107 3.01775082 2.81764205 3.03064399]
Mean: 2.8748475203031782
Standard deviation: 0.14762055981438577
# tuning a random forest model; this means optimising the model over a set/grid of values for the hyperparameters
# n_estimators is the number of trees whose predictions are averaged
# max_features is the number of variables randomly selected to be evaluated in a node for predictive power; the best is used in the node
# bootstrap means whether data points may be sampled with repetition (multiple times) for training
from sklearn.model_selection import GridSearchCV
param_grid = [
# try 12 (3×4) combinations of hyperparameters
{'n_estimators': [3, 10, 30], 'max_features': [2, 4, 6, 8]},
# then try 6 (2×3) combinations with bootstrap set as False
{'bootstrap': [False], 'n_estimators': [3, 10], 'max_features': [2, 3, 4]},
]
forest_reg = RandomForestRegressor(random_state=42)
# train across 5 folds, that's a total of (12+6)*5=90 rounds of training
grid_search = GridSearchCV(forest_reg, param_grid, cv=5, n_jobs=-1,
scoring='neg_mean_squared_error',
return_train_score=True)
grid_search.fit( X_train, y_train)
GridSearchCV(cv=5, estimator=RandomForestRegressor(random_state=42), n_jobs=-1,
param_grid=[{'max_features': [2, 4, 6, 8],
'n_estimators': [3, 10, 30]},
{'bootstrap': [False], 'max_features': [2, 3, 4],
'n_estimators': [3, 10]}],
return_train_score=True, scoring='neg_mean_squared_error')
grid_search.best_estimator_
RandomForestRegressor(max_features=6, n_estimators=30, random_state=42)
feature_importances = grid_search.best_estimator_.feature_importances_
feature_importances
array([0.00245382, 0.00185523, 0.00168107, 0.04050883, 0.04798213,
0.03481492, 0.30608497, 0.47009564, 0.0945234 ])
attributes = X_train.columns
sorted(zip(feature_importances, attributes), reverse=True)
[(0.47009563835161206, 'X5'),
 (0.3060849717767887, 'X4'),
 (0.09452340051969672, 'X7'),
 (0.047982125076286866, 'X2'),
 (0.04050883085690441, 'X1'),
 (0.03481491829763231, 'X3'),
 (0.002453822874099831, 'X0_C1'),
 (0.0018552258312598215, 'X0_C2'),
 (0.0016810664157192936, 'X0_C3')]
X_train = X_train [['X1','X2','X3','X4','X5','X7']]
X_train.head()
| | X1 | X2 | X3 | X4 | X5 | X7 |
|---|---|---|---|---|---|---|
| 0 | -0.636836 | 1.396517 | -1.341560 | 0.779251 | -1.297523 | -1.686515 |
| 1 | -1.681798 | 2.395504 | 0.975381 | 1.709735 | -1.129402 | 0.325102 |
| 2 | -1.263813 | 1.646222 | -0.878171 | 0.779251 | -1.297523 | 1.474597 |
| 3 | -1.054821 | 2.645293 | 0.743601 | 1.709735 | -0.955858 | 0.325102 |
| 4 | -0.427844 | 0.329459 | 0.095030 | 0.779251 | -0.955858 | 0.899849 |
X_test = X_test[['X1','X2','X3','X4','X5','X7']]
X_test.head()
| | X1 | X2 | X3 | X4 | X5 | X7 |
|---|---|---|---|---|---|---|
| 0 | 2.985697 | -2.085532 | -0.507461 | -1.505133 | 0.920591 | 0.037728 |
| 1 | -1.403141 | -0.079302 | 0.743773 | 1.244493 | -0.955858 | 1.761971 |
| 2 | -1.403141 | 1.214883 | 0.604671 | 0.017598 | -0.955858 | 1.474597 |
| 3 | -1.403141 | 2.645293 | 0.174984 | 0.779251 | -1.129402 | -0.249646 |
| 4 | 0.199133 | -1.305247 | -0.962424 | -1.505133 | 0.958554 | 1.761971 |
To avoid information leaks, we shouldn't tweak the model based on the test set, hence we also need to set aside a validation set: we keep a portion of the training set as the validation set, train on the remaining data, and then evaluate on the validation set. We evaluate our model on the test set only once. The validation set is also used to optimise the model's hyperparameters.
partial_X_train, X_val, partial_y_train, y_val = train_test_split(X_train, y_train, random_state=42)
print(partial_X_train.shape)
print(partial_y_train.shape)
print(X_val.shape)
print(y_val.shape)
(402, 6)
(402,)
(135, 6)
(135,)
Mean of the squared errors:

MSE = (1/N) * Σ (y_i - yhat_i)²

where y_i is the i'th expected value in the dataset and yhat_i is the i'th predicted value. The difference between these two values is squared, which has the effect of removing the sign, resulting in a positive error value.
Large errors are also inflated as a result of the squaring. The wider the discrepancy between the predicted and true values, the larger the squared positive error. When MSE is employed as a loss function, this has the effect of "punishing" models more for higher errors. When employed as a metric, it also has the effect of "punishing" models by raising the average error score.
The units of the MSE are squared units.
The square root of the mean of the squared errors:

RMSE = sqrt((1/N) * Σ (y_i - yhat_i)²)

where y_i is the i'th expected value in the dataset, yhat_i is the i'th predicted value, and sqrt() is the square root function.
The RMSE, or Root Mean Squared Error, is a variant of the mean squared error. However, because the square root is taken, the RMSE's units are the same as the original units of the target value. MSE is heavily biased toward larger values, so RMSE is better at expressing performance when dealing with large error values; as a result, RMSE is more useful when lower residual values are preferred.
Mean of the absolute value of the errors:

MAE = (1/N) * Σ abs(y_i - yhat_i)

where y_i is the i'th expected value in the dataset, yhat_i is the i'th predicted value, and abs() is the absolute value function.
MAE is a common statistic because, like RMSE, the error score's units match the units of the target value. The changes in MAE, unlike the RMSE, are linear and thus intuitive.
MSE and RMSE punish greater errors more harshly than smaller ones, inflating the mean error score. The MAE does not give distinct types of errors more or less weight; instead, the scores grow linearly as the amount of error increases.
For a least-squares fit, the R-squared value lies between 0 and 100%. R-squared is a statistical measure of how close the data are to the fitted regression line, indicating the percentage of the variance in the dependent variable that the independent variables explain collectively. R-squared cannot determine whether the coefficient estimates and predictions are biased, so we must also analyse the residual plots. In some fields, it is completely normal to have low R-squared values: values below 50% are common in any science that seeks to predict human behaviour, such as psychology.
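The four metrics just described can be computed by hand and cross-checked against scikit-learn; a small sketch on hypothetical predictions:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

err = y_true - y_pred
mse = (err ** 2).mean()   # mean of squared errors, in squared units
rmse = np.sqrt(mse)       # back in the units of the target
mae = np.abs(err).mean()  # grows linearly with the error size
r2 = 1 - (err ** 2).sum() / ((y_true - y_true.mean()) ** 2).sum()

print(mse, rmse, mae, round(r2, 4))  # 0.375 0.612... 0.5 0.9486

# agree with scikit-learn's implementations
assert np.isclose(mse, mean_squared_error(y_true, y_pred))
assert np.isclose(mae, mean_absolute_error(y_true, y_pred))
assert np.isclose(r2, r2_score(y_true, y_pred))
```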
Cross-validation is a resampling technique for evaluating machine learning models, especially on a small sample of data. The procedure has a single parameter, k, which specifies the number of groups into which a given data sample should be divided; as a result, it is frequently referred to as k-fold cross-validation. For each group i, we train a model on the remaining k-1 partitions and evaluate it on partition i. The final score is the average of the k scores obtained. This method is helpful when the performance of a model shows significant variance depending on the train-test split. Importantly, each observation in the data sample is assigned to a distinct group and remains there throughout the process. This means that each sample has the chance to be used in the hold-out set once and to train the model k-1 times. This strategy provides a more accurate estimate of the algorithm's performance on new data because the algorithm is trained and assessed several times on different data. The size of each test partition must be large enough to constitute a good sample of the problem, whilst allowing enough repetitions of the train-test evaluation to provide a fair estimate of the algorithm's performance on unseen data. K values of 3, 5, and 10 are common for modest-sized datasets with thousands or tens of thousands of observations.
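The k-fold procedure can be written out by hand with sklearn's KFold splitter; a minimal sketch on synthetic data (the actual evaluation below uses RepeatedKFold with cross_val_score):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import KFold

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

kf = KFold(n_splits=5, shuffle=True, random_state=42)
fold_rmse = []
for train_idx, test_idx in kf.split(X):
    # train on the k-1 remaining partitions, evaluate on the held-out partition
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    fold_rmse.append(np.sqrt(mean_squared_error(y[test_idx], pred)))

# the final score is the average over the k per-fold scores
print(len(fold_rmse), round(float(np.mean(fold_rmse)), 3))
```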
cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=42)
def rmse_cross_validation(model):
    rmse_cv = np.sqrt(-cross_val_score(model, X_train, y_train, n_jobs=-1,
                                       scoring='neg_mean_squared_error', cv=cv)).mean()
    return rmse_cv

def mae_cross_validation(model):
    mae_cv = (-cross_val_score(model, X_train, y_train, n_jobs=-1,
                               scoring='neg_mean_absolute_error', cv=cv)).mean()
    return mae_cv
def evaluation(true, predicted):
    mae = mean_absolute_error(true, predicted)
    mse = mean_squared_error(true, predicted)
    rmse = np.sqrt(mean_squared_error(true, predicted))
    r_squared = r2_score(true, predicted)
    return mae, mse, rmse, r_squared
Let's start with linear regression, also known as ordinary least squares (OLS) or linear least squares. It is the simplest algorithm in machine learning, but despite its simplicity it often works well for different types of data. In machine learning problems, if we have a choice between two models, one sophisticated and one much simpler, we should choose the simpler one whenever it represents the data as well as the complex model.
One of the most common problems in model training is overfitting, which occurs when the model fits the training data extremely well, learning the details, patterns, and noise in the training data, which then hurts the model's performance on new, never-before-seen data. In other words, the model fits the training set very closely but performs poorly on the unseen/test data. The goal of machine learning is generalisation: the ability to make good predictions on new data.
As can be seen in the heatmap above, there is a substantial correlation between the independent variables, which can give rise to an issue known as multicollinearity and favours the overfitting scenario. This correlation is a concern because independent variables should be independent. When independent variables are correlated, shifts in one variable are linked to shifts in another. The stronger the relationship, the harder it is to change one variable without changing another, and the more challenging it becomes for the model to estimate the relationship between each independent variable and the dependent variable individually.
A regression coefficient represents the mean change in the dependent variable for each 1 unit change in an independent variable when you hold all of the other variables constant.
Multicollinearity is responsible for the following two sorts of issues:
It increases the variance of the coefficient estimates and they become very sensitive to small changes in the model.
It affects the precision of computed coefficients, lowering regression model's statistical power.
OLS has several weaknesses, including sensitivity to outliers, multicollinearity, and heteroscedasticity.
Heteroscedasticity refers to situations where the residuals of a regression model do not have a constant variance. If the scatter of the residuals is unequal, i.e. the population used in the regression has unequal variance, then the conclusions of the study might be dubious.
Heteroscedasticity is responsible for the following two sorts of issues:
It does not cause coefficient estimates to be biased, but it does make them less precise. With lower precision, the coefficient estimates are more likely to be far from the correct population value.
P-values tend to be smaller than they should be. This impact occurs because heteroscedasticity raises the variance of the coefficient estimates, which the OLS approach ignores.
model = LinearRegression()
model.fit(partial_X_train,partial_y_train)
pred = model.predict(X_val)
# Display the intercept and coefficients of the feature(s) in the model, computed with the SVD method by sklearn
print("Intercept: \n", model.intercept_)
print("Coefficients: \n",model.coef_)
Intercept: 22.27214818214255 Coefficients: [ 0.17029201 0.01574719 1.87321122 -2.45331073 5.94671804 2.81829013]
#Predicted Values vs True values
import hvplot.pandas
pd.DataFrame({'True values': y_val, 'Predicted values': pred}).hvplot.scatter(x='True values', y='Predicted values', title='Predicted Values vs True values')
results = pd.DataFrame(data=[['Linear Regression', *evaluation(y_val, pred), rmse_cross_validation(model), mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV', 'MAE_CV'])
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
#Residuals distribution
Residuals = pd.DataFrame({'Residuals': (pred - y_val)})
sns.boxplot(Residuals['Residuals'], color = 'r')
plt.title('Residuals Distribution', fontsize = 16)
plt.xlabel('Residuals', fontsize = 14)
plt.xticks(fontsize = 12)
plt.yticks(fontsize = 12)
plt.show()
#Residuals plot
from yellowbrick.regressor import ResidualsPlot
visualizer = ResidualsPlot(model, hist=False, qqplot=True)
visualizer.fit(partial_X_train,partial_y_train)
visualizer.score(X_val, y_val)
visualizer.show()
The Residuals vs Fitted plot shows if residuals have non-linear patterns. There could be a non-linear relationship between predictor variables and an outcome variable and the pattern could show up in this plot if the model does not capture the non-linear relationship. In this case, the model seems to be close to linear. In general, we can find equally spread residuals around the horizontal line which is a good indication that we do not have significant non-linear relationships.
The Normal Q-Q plot shows that the residuals are normally distributed and they follow the straight line well.
Overfitting and multicollinearity are two issues that may arise during model training. Regularisation is one of the techniques developed to help mitigate them: it consists of adding a penalty term to the objective function. Regularisation reduces the model's variance without considerably increasing its bias, resulting in more useful coefficient estimates.
Ridge Regression uses L2 regularisation, which tries to minimise the objective function by adding a penalty term (λ, lambda) times the sum of the squares of the coefficients. In other words, it imposes a penalty on the size of the coefficients. The intercept term is not regularised; the constraint applies only to the sum of squares of the X regression coefficients. Ridge Regression has the same assumptions as linear regression, except that normality of the error terms is not required. Ridge shrinks the coefficient values but does not set them to zero, meaning that no feature selection takes place.
The objective function is: Min (∑ε² + λ∑β²) = Min ∑(y - (β0 + β1X1 + ... + βiXi))² + λ∑β²
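For intuition, the penalised objective above has the closed-form solution β = (XᵀX + λI)⁻¹Xᵀy. A minimal numpy sketch on synthetic data (illustrative only, not the notebook's dataset) shows the shrinkage relative to OLS:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

lam = 1.0
# Centre the data so the (unpenalised) intercept drops out of the problem
Xc, yc = X - X.mean(axis=0), y - y.mean()

# Closed-form ridge solution: beta = (X'X + lambda*I)^(-1) X'y
beta_ridge = np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ yc)
beta_ols = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

print("ridge:", beta_ridge)
print("ols:  ", beta_ols)
```

The L2 norm of the ridge coefficients is always smaller than the OLS norm when λ > 0, which is exactly the shrinkage described above.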
model = Ridge()
model.fit(partial_X_train,partial_y_train)
pred = model.predict(X_val)
results_2 = pd.DataFrame(data=[['Ridge Regression', *evaluation(y_val, pred), rmse_cross_validation(model), mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV','MAE_CV'])
results = pd.concat([results, results_2], ignore_index=True)  # DataFrame.append was removed in pandas 2.0
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624432 | 22.512782 | 4.744764 | 0.776836 | 4.655184 | 3.598095 |
# define model evaluation method
cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=42)
# define model
model = RidgeCV(alphas=np.arange(0.01, 1, 0.01), cv=cv, scoring='neg_mean_absolute_error')  # alpha=0 would be plain OLS, so the grid starts at 0.01
# fit model
result = model.fit(X_train,y_train)
# summarize
print('MAE: %.3f' % result.best_score_)
print('Alpha: %s' % result.alpha_)
MAE: -3.598 Alpha: 0.99
model = Ridge(alpha=0.99)
model.fit(partial_X_train,partial_y_train)
pred = model.predict(X_val)
results_2_1 = pd.DataFrame(data=[['Ridge Regression', *evaluation(y_val, pred), rmse_cross_validation(model), mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV','MAE_CV'])
results = pd.concat([results, results_2_1], ignore_index=True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624432 | 22.512782 | 4.744764 | 0.776836 | 4.655184 | 3.598095 |
| 2 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
results.drop(1, inplace = True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 2 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
Lasso stands for Least Absolute Shrinkage and Selection Operator. It uses the L1 regularisation technique in the objective function. Lasso penalises the size of the regression coefficients just as Ridge Regression does, but it differs from ridge regression in that it uses absolute values in the penalty function rather than squares, which drives certain parameter estimates to exactly zero and thereby aids feature selection. Furthermore, by identifying a simpler model, it is capable of minimising the variability and enhancing the accuracy of linear regression models, which prevents the model from becoming overfit. When the independent variables are extremely collinear, Lasso regression tends to select only one of them and shrink the others to zero.
The objective function is: Min (∑ε² + λ∑|β|) = Min ∑(y - (β0 + β1X1 + ... + βiXi))² + λ∑|β|
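A minimal sketch on synthetic data (illustrative only, not the notebook's dataset) shows Lasso driving irrelevant coefficients to exactly zero, while Ridge merely shrinks them:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 3 * X[:, 0] + rng.normal(scale=0.1, size=200)  # only the first feature matters

lasso = Lasso(alpha=0.1).fit(X, y)
ridge = Ridge(alpha=0.1).fit(X, y)

print("Lasso coefficients:", lasso.coef_)  # irrelevant features set to exactly zero
print("Ridge coefficients:", ridge.coef_)  # shrunk, but typically not exactly zero
```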
model = Lasso()
model.fit(partial_X_train,partial_y_train)
pred = model.predict(X_val)
results_3 = pd.DataFrame(data=[['Lasso Regression', *evaluation(y_val, pred), rmse_cross_validation(model), mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV', 'MAE_CV'])
results = pd.concat([results, results_3], ignore_index=True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
| 2 | Lasso Regression | 3.702809 | 24.192171 | 4.918554 | 0.760188 | 4.947995 | 3.780652 |
# define model evaluation method
cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=42)
# define model
model = LassoCV(alphas=np.arange(0.01, 1, 0.01), cv=cv, n_jobs=-1)  # Lasso with alpha=0 is not advised, so the grid starts at 0.01
# fit model
result = model.fit(X_train,y_train)
# summarize
print('Alpha: %s' % result.alpha_)
Alpha: 0.12
model = Lasso(alpha=0.12)
model.fit(partial_X_train,partial_y_train)
pred = model.predict(X_val)
results_3_1 = pd.DataFrame(data=[['Lasso Regression', *evaluation(y_val, pred), rmse_cross_validation(model),mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV','MAE_CV'])
results = pd.concat([results, results_3_1], ignore_index=True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
| 2 | Lasso Regression | 3.702809 | 24.192171 | 4.918554 | 0.760188 | 4.947995 | 3.780652 |
| 3 | Lasso Regression | 3.581572 | 22.351453 | 4.727732 | 0.778435 | 4.645083 | 3.577086 |
results.drop(2, inplace = True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
| 3 | Lasso Regression | 3.581572 | 22.351453 | 4.727732 | 0.778435 | 4.645083 | 3.577086 |
Elastic Net is a combination of both L1 and L2 regularisation.
The objective function for Elastic Net Regression is: Min (∑ε² + λ1∑β² + λ2∑|β|) = Min ∑(y - (β0 + β1X1 + ... + βiXi))² + λ1∑β² + λ2∑|β|
Like ridge and lasso regression, it does not assume normality of the error terms.
model = ElasticNet(alpha=0.05, l1_ratio=0.5, random_state=42)
model.fit(partial_X_train,partial_y_train)
pred = model.predict(X_val)
results_4 = pd.DataFrame(data=[['ElasticNet Regression', *evaluation(y_val, pred), rmse_cross_validation(model),mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV','MAE_CV'])
results = pd.concat([results, results_4], ignore_index=True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
| 2 | Lasso Regression | 3.581572 | 22.351453 | 4.727732 | 0.778435 | 4.645083 | 3.577086 |
| 3 | ElasticNet Regression | 3.586087 | 22.437620 | 4.736837 | 0.777581 | 4.654729 | 3.586794 |
# define model evaluation method
cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=42)
# define model
ratios = np.arange(0.01, 1, 0.01)
alphas = np.arange(0.01, 1, 0.01)  # alpha=0 is not supported by the coordinate-descent solver, so start at 0.01
model = ElasticNetCV(l1_ratio=ratios, alphas=alphas, cv=cv, n_jobs=-1)
# fit model
model.fit(X_train,y_train)
# summarize chosen configuration
print('alpha: %f' % model.alpha_)
print('l1_ratio_: %f' % model.l1_ratio_)
alpha: 0.110000 l1_ratio_: 0.990000
model = ElasticNet(alpha=0.11, l1_ratio=0.99, random_state=42)
model.fit(partial_X_train,partial_y_train)
pred = model.predict(X_val)
results_4_1 = pd.DataFrame(data=[['ElasticNet Regression', *evaluation(y_val, pred), rmse_cross_validation(model),mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV','MAE_CV'])
results = pd.concat([results, results_4_1], ignore_index=True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
| 2 | Lasso Regression | 3.581572 | 22.351453 | 4.727732 | 0.778435 | 4.645083 | 3.577086 |
| 3 | ElasticNet Regression | 3.586087 | 22.437620 | 4.736837 | 0.777581 | 4.654729 | 3.586794 |
| 4 | ElasticNet Regression | 3.583389 | 22.361371 | 4.728781 | 0.778337 | 4.645421 | 3.578348 |
I decided to use the Root Mean Squared Error (RMSE) under cross-validation to compare the models and pick the best one for this dataset. As previously stated, the cross-validation approach trains the model over numerous train-test splits, so a model that scores well is more likely to perform well on unseen data, and splitting the dataset into multiple folds and training the algorithm on different folds helps prevent the model from overfitting the training dataset. Another advantage of k-fold cross-validation is that each data point is tested exactly once and used in training k-1 times. As k increases, the bias of the final estimate decreases, because each training fold is closer in size to the full dataset (though the variance of the estimate may increase). As a result, the scores reflect the model's generalisation ability, which is an indication of a robust method.
Since we want to minimise errors and RMSE is a negatively-oriented score, lower values are better.
results.drop(3, inplace = True)
results
| Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV | |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
| 2 | Lasso Regression | 3.581572 | 22.351453 | 4.727732 | 0.778435 | 4.645083 | 3.577086 |
| 4 | ElasticNet Regression | 3.583389 | 22.361371 | 4.728781 | 0.778337 | 4.645421 | 3.578348 |
According to our loss function, RMSE under cross-validation, Lasso Regression is the model that best fits the data, with the lowest RMSE_CV value, 4.645083. This model also performs best on the other measures: it has the highest R-squared, 0.778435, indicating that about 78% of the variance in the Y variable is explained by the independent variables collectively, and the lowest MAE_CV, meaning that on average the predicted Heating Load deviates from the true values by just 3.577086.
Before training we need to transform our data into numpy arrays; the network will expect samples as vectors (1D arrays) of floating-point numbers.
#transform data into tensors
partial_X_train = partial_X_train.to_numpy()
partial_y_train = partial_y_train.to_numpy()
X_val = X_val.to_numpy()
y_val = y_val.to_numpy()
X_test = X_test.to_numpy()
y_test = y_test.to_numpy()
The network ends with a single unit and no activation (a linear layer). This is a typical setup for scalar regression (a regression where we are trying to predict a single continuous value). Applying an activation function would constrain the range the output can take; for instance, with a sigmoid activation function on the last layer, the network could only learn to predict values between 0 and 1.
from tensorflow.keras import models, layers
def build_baseline_model():
    # empty feed-forward network: a series of transformation layers,
    # with data moving in a single direction
    model = models.Sequential()
    model.add(layers.Dense(64, activation='relu', input_shape=(partial_X_train.shape[1],)))
    model.add(layers.Dense(32, activation='relu'))
    model.add(layers.Dense(1))
    model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])
    return model
model = build_baseline_model()
history = model.fit(partial_X_train,partial_y_train, validation_data=(X_val, y_val),epochs=20, batch_size=16)
Train on 402 samples, validate on 135 samples Epoch 1/20
402/402 [==============================] - 1s 2ms/sample - loss: 561.0986 - mae: 21.3789 - val_loss: 511.4028 - val_mae: 20.2058
... (per-epoch training log truncated)
Epoch 20/20 402/402 [==============================] - 0s 141us/sample - loss: 27.7837 - mae: 3.8282 - val_loss: 29.6201 - val_mae: 3.8142
history_dict = history.history
history_dict.keys()
dict_keys(['loss', 'mae', 'val_loss', 'val_mae'])
def plot_loss():
    history_dict = history.history
    loss = history_dict['loss']
    val_loss = history_dict['val_loss']
    epochs = range(1, len(loss) + 1)
    blue_dots = 'bo'
    solid_blue_line = 'b'
    plt.plot(epochs, loss, blue_dots, label='Training loss')
    plt.plot(epochs, val_loss, solid_blue_line, label='Validation loss')
    plt.title('Training and Validation loss')
    plt.xlabel('Epochs')
    plt.ylabel('Loss')
    plt.legend()
    plt.show()
plot_loss()
def plot_metric():
    history_dict = history.history
    mae = history_dict['mae']
    val_mae = history_dict['val_mae']
    epochs = range(1, len(mae) + 1)
    blue_dots = 'bo'
    solid_blue_line = 'b'
    plt.plot(epochs, mae, blue_dots, label='Training MAE')
    plt.plot(epochs, val_mae, solid_blue_line, label='Validation MAE')
    plt.title('Training and Validation MAE')
    plt.xlabel('Epochs')
    plt.ylabel('MAE')
    plt.legend()
    plt.show()
plot_metric()
print('MSE: %f' % model.evaluate(X_val, y_val, verbose=0)[0])
print('MAE: %f' % model.evaluate(X_val, y_val, verbose=0)[1])
MSE: 29.620142 MAE: 3.814236
history = model.fit(partial_X_train,partial_y_train, validation_data=(X_val, y_val),
epochs=500, batch_size=16, verbose=0)
mae_history = history.history['val_mae']
plt.plot(range(1, len(mae_history)+1), mae_history)
plt.xlabel('Epochs')
plt.ylabel('Validation MAE')
plt.show()
This plot can be a little hard to read due to scaling issues and relatively high variance, so we will omit the first 10 data points and replace each remaining point with an exponential moving average of the previous points, to obtain a smoother curve:
def smooth_curve(points, factor=0.9):
    smoothed_points = []
    for point in points:
        if smoothed_points:
            previous = smoothed_points[-1]
            smoothed_points.append(previous * factor + point * (1 - factor))
        else:
            smoothed_points.append(point)
    return smoothed_points
smoothed_mae_history = smooth_curve(mae_history[10:])
plt.plot(range(1,len(smoothed_mae_history)+1), smoothed_mae_history)
plt.xlabel('Epochs')
plt.ylabel('Validation MAE')
plt.show()
According to this plot, validation MAE stops improving significantly after about 100 epochs; past that point, we start overfitting.
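As an aside, the smooth_curve loop above is an exponential moving average, s[t] = factor·s[t−1] + (1−factor)·p[t] seeded with s[0] = p[0]; with adjust=False, pandas' ewm computes the same recursion (toy values for illustration):

```python
import pandas as pd

points = [3.0, 2.5, 2.8, 2.2, 2.0]

# ewm with alpha=0.1, adjust=False applies:
#   s[t] = 0.9 * s[t-1] + 0.1 * p[t], with s[0] = p[0]
# which matches smooth_curve(points, factor=0.9)
smoothed = pd.Series(points).ewm(alpha=0.1, adjust=False).mean().tolist()
print(smoothed)
```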
With SGD, momentum addresses two issues: convergence speed and local minima (getting stuck). Gradient descent searches for the global minimum of the loss (where the derivative is 0), the set of weight values that produces the smallest loss possible; during this process, optimisation may become trapped at a local minimum instead of progressing to the global minimum. With momentum, the parameters W (in dot(W, input) + b) are updated based on both the current gradient value and the previous update.
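The momentum update can be sketched in a few lines of numpy (illustrative hyperparameters and loss, not tied to the notebook's network):

```python
def momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    # the new update blends the previous update (velocity) with the current gradient
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

w, v = 5.0, 0.0
for _ in range(3):
    grad = 2 * w           # gradient of the toy loss f(w) = w**2
    w, v = momentum_step(w, grad, v)

print(w)  # moving towards the minimum at w = 0
```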
from tensorflow.keras import optimizers
def build_model():
    model = models.Sequential()
    model.add(layers.Dense(64, activation="selu", kernel_initializer="lecun_normal",
                           input_shape=(partial_X_train.shape[1],)))
    model.add(layers.Dense(32, activation="selu", kernel_initializer="lecun_normal"))
    model.add(layers.Dense(1))
    model.compile(optimizer=optimizers.RMSprop(learning_rate=0.001, momentum=0.9),
                  loss='mse', metrics=['mae'])
    return model
model = build_model()
history = model.fit(partial_X_train,partial_y_train, validation_data=(X_val, y_val), epochs=100,
batch_size=32,verbose=0)
plot_loss()
print('MSE: %f' % model.evaluate(X_val, y_val, verbose=0)[0])
print('MAE: %f' % model.evaluate(X_val, y_val, verbose=0)[1])
MSE: 14.364780 MAE: 2.590615
When compared to the results of our base model, the MAE and MSE values from this model, after the introduction of momentum and the change of activation function from ReLU to SELU, show a significant improvement. SELU achieves self-normalisation of the network: the output of each layer tends to preserve a mean of 0 and a standard deviation of 1. It also helps with vanishing gradients, where gradients become small or zero and the weights and biases of the early layers are not updated effectively during training.
In machine learning problems, if we have a choice between two models, one sophisticated and the other much simpler, we should choose the simpler one if it represents the data as well as the complex model: overfitting is less common with simpler models. Weights are "regular" if they fill a small interval close to zero; large weight parameters amplify noise, and the network becomes attuned to that noise, whereas a model with smaller parameters is more robust. Putting constraints on a network's complexity by forcing its weights to take only small, regular values is a frequent approach to avoiding overfitting. This is known as weight regularisation, and it is accomplished by including a cost associated with large weights in the network's loss function.
L1 regularisation is more robust to outliers than L2 regularisation: L2 takes the square of the weights, so the cost of outliers present in the data grows quadratically, while L1 takes the absolute values of the weights, so the cost only grows linearly. As previously noted, Lasso was the best-performing linear regression model; since Lasso uses L1 regularisation, we will use L1 for our neural network model as well.
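A tiny numeric comparison makes the difference in penalty growth concrete:

```python
import numpy as np

weights = np.array([0.1, 1.0, 10.0])
l1_penalty = np.abs(weights)      # grows linearly with the weight
l2_penalty = weights ** 2         # grows quadratically with the weight

for w, p1, p2 in zip(weights, l1_penalty, l2_penalty):
    print(f"w={w:>5}: L1 cost={p1:>6}, L2 cost={p2:>8}")
```

Going from a weight of 1 to 10 multiplies the L1 cost by 10 but the L2 cost by 100, which is why L2 reacts so strongly to outlying weights.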
from tensorflow.keras import regularizers
def build_model():
    model = models.Sequential()
    model.add(layers.Dense(64, kernel_regularizer=regularizers.l1(0.001), activation="selu",
                           kernel_initializer="lecun_normal",
                           input_shape=(partial_X_train.shape[1],)))
    model.add(layers.Dense(32, activation="selu", kernel_initializer="lecun_normal"))
    model.add(layers.Dense(1))
    model.compile(optimizer=optimizers.RMSprop(learning_rate=0.001, momentum=0.7),
                  loss='mse', metrics=['mae'])
    return model
model = build_model()
history = model.fit(partial_X_train,partial_y_train, validation_data=(X_val, y_val), epochs=100,
batch_size=64,verbose=0)
plot_loss()
print('MSE: %f' % model.evaluate(X_val, y_val, verbose=0)[0])
print('MAE: %f' % model.evaluate(X_val, y_val, verbose=0)[1])
MSE: 14.238459 MAE: 2.630859
When compared to the results of our last model, the MAE and MSE values from this model after the introduction of L1 regularisation show a slight improvement.
Batch normalisation is a type of layer that can adaptively normalise the data even as the mean and variance change over time during training. It works by internally maintaining an exponential moving average of the batch-wise mean and variance of the data seen during training. The main effect of batch normalisation is that it helps with gradient propagation, much like residual connections, and thus allows for deeper networks. The BatchNormalization layer takes an axis argument that specifies the feature axis to normalise; it defaults to -1, the last axis in the input tensor.
from tensorflow.keras import optimizers
def build_model():
    model = models.Sequential()
    model.add(layers.Dense(64, kernel_regularizer=regularizers.l1(0.001), activation="selu",
                           kernel_initializer="lecun_normal",
                           input_shape=(partial_X_train.shape[1],)))
    model.add(layers.BatchNormalization())  # the layer must be added to the model, not just instantiated
    model.add(layers.Dense(32, activation="selu", kernel_initializer="lecun_normal"))
    model.add(layers.Dense(1))
    model.compile(optimizer=optimizers.RMSprop(learning_rate=0.001, momentum=0.7),
                  loss='mse', metrics=['mae'])
    return model
model = build_model()
history = model.fit(partial_X_train,partial_y_train, validation_data=(X_val, y_val), epochs=100,
batch_size = 64, verbose=0)
plot_loss()
print('MSE: %f' % model.evaluate(X_val, y_val, verbose=0)[0])
print('MAE: %f' % model.evaluate(X_val, y_val, verbose=0)[1])
MSE: 13.576199 MAE: 2.571613
When compared to the results of our last model, the MAE and MSE values from this model after the introduction of batch normalisation show a slight improvement.
Dropout is one of the most successful and widely used regularisation strategies for neural networks. During training, dropout randomly drops out (sets to zero) a number of the layer's output features. The dropout rate is the fraction of features that are zeroed out, normally in the range of 0.2 to 0.5. The idea is that introducing noise in the output values of a layer can break up insignificant patterns that the network would start memorising if no noise were present.
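The mechanics can be sketched in numpy ("inverted dropout", as Keras applies it at training time, scaling the kept units by 1/(1−rate); the activations below are toy values for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
rate = 0.3                                # fraction of units to zero out
layer_output = rng.normal(size=(4, 8))    # toy activations for a batch of 4

# Zero out a random fraction of units, then scale the survivors up by
# 1/(1 - rate) so the expected activation magnitude is unchanged
mask = rng.random(layer_output.shape) >= rate
dropped = layer_output * mask / (1 - rate)

print("fraction zeroed:", 1 - mask.mean())
```

At test time no units are dropped and no scaling is needed, because the scaling was already done during training.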
def build_model():
    model = models.Sequential()
    model.add(layers.Dense(64, kernel_regularizer=regularizers.l1(0.001), activation="selu",
                           kernel_initializer="lecun_normal",
                           input_shape=(partial_X_train.shape[1],)))
    model.add(layers.BatchNormalization())
    model.add(layers.Dense(32, activation="selu", kernel_initializer="lecun_normal"))
    model.add(layers.Dropout(rate=0.3))  # layers must be added to the model, not just instantiated
    model.add(layers.Dense(1))
    model.compile(optimizer=optimizers.RMSprop(learning_rate=0.001, momentum=0.7),
                  loss='mse', metrics=['mae'])
    return model
model = build_model()
history = model.fit(partial_X_train,partial_y_train, validation_data=(X_val, y_val), epochs=100,
batch_size=64,verbose=0)
plot_loss()
print('MSE: %f' % model.evaluate(X_val, y_val, verbose=0)[0])
print('MAE: %f' % model.evaluate(X_val, y_val, verbose=0)[1])
MSE: 13.852507 MAE: 2.557960
With the addition of dropout, the results were slightly worse than the previous model's, so I will tune the model without dropout.
# Define a function producing a neural net model
def build_model(n_hidden=1, n_neurons=32, learning_rate=3e-3, input_shape=(partial_X_train.shape[1],)):
    model = keras.models.Sequential()
    model.add(keras.layers.InputLayer(input_shape=input_shape))
    for layer in range(n_hidden):
        model.add(keras.layers.Dense(n_neurons, kernel_regularizer=regularizers.l1(0.001),
                                     activation="selu", kernel_initializer="lecun_normal"))
        model.add(keras.layers.BatchNormalization())
    model.add(layers.Dense(1))
    optimizer = optimizers.RMSprop(learning_rate=learning_rate, decay=1e-4, momentum=0.7)
    model.compile(loss="mse", optimizer=optimizer, metrics=['mae'])
    return model
keras_reg = keras.wrappers.scikit_learn.KerasRegressor(build_model)
Callbacks
# Tuning the neural net model in a cross validation.
# Fit the model with early stopping.
from scipy.stats import reciprocal
from sklearn.model_selection import RandomizedSearchCV
param_distribs = {
    "n_hidden": [1, 2, 3, 5],
    "n_neurons": np.arange(1, 100).tolist(),
    "learning_rate": reciprocal(1e-3, 3e-2).rvs(1000).tolist(),
}
rnd_search_cv = RandomizedSearchCV(keras_reg, param_distribs, n_iter=10, cv=4, verbose=2)
rnd_search_cv.fit(partial_X_train, partial_y_train, epochs=200,  # up to 200 epochs because we use EarlyStopping
                  validation_data=(X_val, y_val),
                  callbacks=[keras.callbacks.EarlyStopping(patience=10)])
Fitting 4 folds for each of 10 candidates, totalling 40 fits
Train on 301 samples, validate on 135 samples
Epoch 1/200 301/301 [==============================] - 0s 2ms/sample - loss: 561.0374 - mae: 22.0515 - val_loss: 482.4188 - val_mae: 20.8646
... (per-epoch training log truncated)
3.0896 Epoch 43/200 301/301 [==============================] - 0s 128us/sample - loss: 14.4292 - mae: 2.8432 - val_loss: 17.1394 - val_mae: 3.1046 Epoch 44/200 301/301 [==============================] - 0s 124us/sample - loss: 14.2369 - mae: 2.8005 - val_loss: 17.6416 - val_mae: 3.1903 Epoch 45/200 301/301 [==============================] - 0s 124us/sample - loss: 14.8422 - mae: 2.8356 - val_loss: 16.8288 - val_mae: 3.0893 Epoch 46/200 301/301 [==============================] - 0s 124us/sample - loss: 13.9961 - mae: 2.7424 - val_loss: 16.6458 - val_mae: 3.0327 Epoch 47/200 301/301 [==============================] - 0s 131us/sample - loss: 13.7729 - mae: 2.7609 - val_loss: 17.0804 - val_mae: 3.1065 Epoch 48/200 301/301 [==============================] - 0s 125us/sample - loss: 13.7018 - mae: 2.7660 - val_loss: 16.8179 - val_mae: 3.0892 Epoch 49/200 301/301 [==============================] - 0s 125us/sample - loss: 13.5676 - mae: 2.7227 - val_loss: 16.9621 - val_mae: 3.0946 Epoch 50/200 301/301 [==============================] - 0s 120us/sample - loss: 13.9791 - mae: 2.8134 - val_loss: 16.5728 - val_mae: 3.0465 Epoch 51/200 301/301 [==============================] - 0s 126us/sample - loss: 13.4881 - mae: 2.7456 - val_loss: 16.4906 - val_mae: 3.0461 Epoch 52/200 301/301 [==============================] - 0s 127us/sample - loss: 13.5835 - mae: 2.7415 - val_loss: 16.4801 - val_mae: 3.0509 Epoch 53/200 301/301 [==============================] - 0s 131us/sample - loss: 13.6456 - mae: 2.7383 - val_loss: 16.2777 - val_mae: 3.0182 Epoch 54/200 301/301 [==============================] - 0s 127us/sample - loss: 13.0961 - mae: 2.7063 - val_loss: 16.9686 - val_mae: 3.1277 Epoch 55/200 301/301 [==============================] - 0s 124us/sample - loss: 13.0568 - mae: 2.6909 - val_loss: 16.3522 - val_mae: 3.0388 Epoch 56/200 301/301 [==============================] - 0s 123us/sample - loss: 13.2103 - mae: 2.7172 - val_loss: 16.0988 - val_mae: 3.0075 Epoch 57/200 301/301 
[==============================] - 0s 131us/sample - loss: 13.1666 - mae: 2.7061 - val_loss: 16.9974 - val_mae: 3.0776 Epoch 58/200 301/301 [==============================] - 0s 125us/sample - loss: 13.2884 - mae: 2.7064 - val_loss: 16.5006 - val_mae: 3.0217 Epoch 59/200 301/301 [==============================] - 0s 124us/sample - loss: 13.0648 - mae: 2.6818 - val_loss: 15.9226 - val_mae: 3.0042 Epoch 60/200 301/301 [==============================] - 0s 128us/sample - loss: 12.7966 - mae: 2.6548 - val_loss: 15.8059 - val_mae: 2.9869 Epoch 61/200 301/301 [==============================] - 0s 125us/sample - loss: 12.7328 - mae: 2.6299 - val_loss: 16.2347 - val_mae: 3.0263 Epoch 62/200 301/301 [==============================] - 0s 125us/sample - loss: 12.7775 - mae: 2.6316 - val_loss: 15.9741 - val_mae: 3.0020 Epoch 63/200 301/301 [==============================] - 0s 127us/sample - loss: 12.5232 - mae: 2.6125 - val_loss: 16.7520 - val_mae: 3.0341 Epoch 64/200 301/301 [==============================] - 0s 127us/sample - loss: 12.4166 - mae: 2.6248 - val_loss: 16.2870 - val_mae: 3.0667 Epoch 65/200 301/301 [==============================] - 0s 123us/sample - loss: 12.7824 - mae: 2.6725 - val_loss: 15.8838 - val_mae: 2.9631 Epoch 66/200 301/301 [==============================] - 0s 125us/sample - loss: 12.2830 - mae: 2.5938 - val_loss: 15.7316 - val_mae: 2.9914 Epoch 67/200 301/301 [==============================] - 0s 124us/sample - loss: 12.3744 - mae: 2.6060 - val_loss: 15.4273 - val_mae: 2.9589 Epoch 68/200 301/301 [==============================] - 0s 127us/sample - loss: 12.5799 - mae: 2.6289 - val_loss: 15.8775 - val_mae: 2.9771 Epoch 69/200 301/301 [==============================] - 0s 122us/sample - loss: 12.3652 - mae: 2.6227 - val_loss: 16.0524 - val_mae: 2.9946 Epoch 70/200 301/301 [==============================] - 0s 126us/sample - loss: 12.2285 - mae: 2.5781 - val_loss: 15.7989 - val_mae: 3.0192 Epoch 71/200 301/301 [==============================] - 0s 
125us/sample - loss: 12.2730 - mae: 2.5861 - val_loss: 15.5483 - val_mae: 2.9382 Epoch 72/200 301/301 [==============================] - 0s 125us/sample - loss: 12.2946 - mae: 2.5980 - val_loss: 15.7222 - val_mae: 2.9570 Epoch 73/200 301/301 [==============================] - 0s 127us/sample - loss: 12.0517 - mae: 2.6173 - val_loss: 18.1441 - val_mae: 3.1780 Epoch 74/200 301/301 [==============================] - 0s 124us/sample - loss: 12.3133 - mae: 2.6103 - val_loss: 15.3993 - val_mae: 2.9691 Epoch 75/200 301/301 [==============================] - 0s 124us/sample - loss: 12.3186 - mae: 2.6382 - val_loss: 15.5224 - val_mae: 2.9004 Epoch 76/200 301/301 [==============================] - 0s 122us/sample - loss: 12.2242 - mae: 2.5804 - val_loss: 15.1272 - val_mae: 2.8996 Epoch 77/200 301/301 [==============================] - 0s 123us/sample - loss: 11.8596 - mae: 2.5518 - val_loss: 15.5278 - val_mae: 2.9865 Epoch 78/200 301/301 [==============================] - 0s 126us/sample - loss: 12.1934 - mae: 2.6008 - val_loss: 15.6972 - val_mae: 2.9239 Epoch 79/200 301/301 [==============================] - 0s 121us/sample - loss: 11.7916 - mae: 2.5489 - val_loss: 15.3534 - val_mae: 2.9577 Epoch 80/200 301/301 [==============================] - 0s 121us/sample - loss: 11.8649 - mae: 2.5351 - val_loss: 15.6055 - val_mae: 2.9134 Epoch 81/200 301/301 [==============================] - 0s 123us/sample - loss: 12.0926 - mae: 2.5723 - val_loss: 15.2255 - val_mae: 2.9426 Epoch 82/200 301/301 [==============================] - 0s 123us/sample - loss: 11.8684 - mae: 2.5571 - val_loss: 16.4887 - val_mae: 2.9584 Epoch 83/200 301/301 [==============================] - 0s 127us/sample - loss: 12.0999 - mae: 2.5670 - val_loss: 15.1375 - val_mae: 2.9162 Epoch 84/200 301/301 [==============================] - 0s 125us/sample - loss: 11.6226 - mae: 2.5239 - val_loss: 15.1185 - val_mae: 2.9081 Epoch 85/200 301/301 [==============================] - 0s 128us/sample - loss: 11.8152 - mae: 
2.5226 - val_loss: 15.1824 - val_mae: 2.9215 Epoch 86/200 301/301 [==============================] - 0s 131us/sample - loss: 11.6603 - mae: 2.5473 - val_loss: 15.5520 - val_mae: 2.8945 Epoch 87/200 301/301 [==============================] - 0s 129us/sample - loss: 11.5635 - mae: 2.5117 - val_loss: 15.2660 - val_mae: 2.9513 Epoch 88/200 301/301 [==============================] - 0s 132us/sample - loss: 11.6791 - mae: 2.5407 - val_loss: 14.8987 - val_mae: 2.8316 Epoch 89/200 301/301 [==============================] - 0s 124us/sample - loss: 11.6516 - mae: 2.5164 - val_loss: 14.9519 - val_mae: 2.8623 Epoch 90/200 301/301 [==============================] - 0s 121us/sample - loss: 11.6628 - mae: 2.5005 - val_loss: 14.7854 - val_mae: 2.8643 Epoch 91/200 301/301 [==============================] - 0s 125us/sample - loss: 11.3320 - mae: 2.4753 - val_loss: 15.0703 - val_mae: 2.9247 Epoch 92/200 301/301 [==============================] - 0s 127us/sample - loss: 11.7888 - mae: 2.5435 - val_loss: 15.2939 - val_mae: 2.9766 Epoch 93/200 301/301 [==============================] - 0s 124us/sample - loss: 11.4049 - mae: 2.4981 - val_loss: 15.0064 - val_mae: 2.8594 Epoch 94/200 301/301 [==============================] - 0s 126us/sample - loss: 11.2741 - mae: 2.4693 - val_loss: 15.8086 - val_mae: 2.9086 Epoch 95/200 301/301 [==============================] - 0s 127us/sample - loss: 11.3552 - mae: 2.4754 - val_loss: 15.0517 - val_mae: 2.9061 Epoch 96/200 301/301 [==============================] - 0s 124us/sample - loss: 11.3903 - mae: 2.4785 - val_loss: 15.2074 - val_mae: 2.9221 Epoch 97/200 301/301 [==============================] - 0s 121us/sample - loss: 11.2574 - mae: 2.4693 - val_loss: 15.3038 - val_mae: 2.8844 Epoch 98/200 301/301 [==============================] - 0s 126us/sample - loss: 11.3323 - mae: 2.5096 - val_loss: 15.7899 - val_mae: 2.9121 Epoch 99/200 301/301 [==============================] - 0s 124us/sample - loss: 11.3043 - mae: 2.4687 - val_loss: 14.8477 - val_mae: 
2.8698 Epoch 100/200 301/301 [==============================] - 0s 122us/sample - loss: 11.2480 - mae: 2.4804 - val_loss: 15.4727 - val_mae: 2.8889 101/1 [=====================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 38us/sample - loss: 14.6312 - mae: 2.5942 [CV] END learning_rate=0.002433174108561598, n_hidden=1, n_neurons=34; total time= 4.4s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 553.7652 - mae: 21.8853 - val_loss: 501.8693 - val_mae: 21.0859 Epoch 2/200 301/301 [==============================] - 0s 123us/sample - loss: 477.6636 - mae: 20.6887 - val_loss: 427.1739 - val_mae: 19.6550 Epoch 3/200 301/301 [==============================] - 0s 127us/sample - loss: 398.6311 - mae: 19.0193 - val_loss: 341.1554 - val_mae: 17.6073 Epoch 4/200 301/301 [==============================] - 0s 125us/sample - loss: 308.6167 - mae: 16.7019 - val_loss: 250.7002 - val_mae: 14.9689 Epoch 5/200 301/301 [==============================] - 0s 
125us/sample - loss: 217.2913 - mae: 13.7928 - val_loss: 165.0179 - val_mae: 11.8654 Epoch 6/200 301/301 [==============================] - 0s 122us/sample - loss: 137.1706 - mae: 10.6170 - val_loss: 94.6331 - val_mae: 8.5305 Epoch 7/200 301/301 [==============================] - 0s 125us/sample - loss: 75.0811 - mae: 7.3175 - val_loss: 48.5169 - val_mae: 5.6816 Epoch 8/200 301/301 [==============================] - 0s 122us/sample - loss: 38.2978 - mae: 4.8326 - val_loss: 26.6033 - val_mae: 3.8422 Epoch 9/200 301/301 [==============================] - 0s 124us/sample - loss: 24.1820 - mae: 3.7825 - val_loss: 21.0605 - val_mae: 3.3864 Epoch 10/200 301/301 [==============================] - 0s 128us/sample - loss: 21.1224 - mae: 3.5268 - val_loss: 20.5660 - val_mae: 3.3478 Epoch 11/200 301/301 [==============================] - 0s 125us/sample - loss: 20.5144 - mae: 3.4617 - val_loss: 20.2834 - val_mae: 3.2883 Epoch 12/200 301/301 [==============================] - 0s 125us/sample - loss: 19.8445 - mae: 3.4024 - val_loss: 19.9427 - val_mae: 3.2969 Epoch 13/200 301/301 [==============================] - 0s 123us/sample - loss: 19.1416 - mae: 3.2959 - val_loss: 19.5830 - val_mae: 3.2493 Epoch 14/200 301/301 [==============================] - 0s 125us/sample - loss: 18.5987 - mae: 3.2280 - val_loss: 19.4755 - val_mae: 3.2780 Epoch 15/200 301/301 [==============================] - 0s 124us/sample - loss: 18.1985 - mae: 3.2415 - val_loss: 19.7363 - val_mae: 3.3055 Epoch 16/200 301/301 [==============================] - 0s 122us/sample - loss: 17.8937 - mae: 3.1730 - val_loss: 19.0556 - val_mae: 3.2831 Epoch 17/200 301/301 [==============================] - 0s 121us/sample - loss: 17.3121 - mae: 3.1203 - val_loss: 18.7242 - val_mae: 3.2255 Epoch 18/200 301/301 [==============================] - 0s 121us/sample - loss: 17.7289 - mae: 3.1701 - val_loss: 18.7858 - val_mae: 3.2200 Epoch 19/200 301/301 [==============================] - 0s 126us/sample - loss: 16.9832 - mae: 
3.0467 - val_loss: 18.7696 - val_mae: 3.2255 Epoch 20/200 301/301 [==============================] - 0s 122us/sample - loss: 17.1659 - mae: 3.1214 - val_loss: 18.8042 - val_mae: 3.2562 Epoch 21/200 301/301 [==============================] - 0s 129us/sample - loss: 16.4500 - mae: 3.0238 - val_loss: 20.1124 - val_mae: 3.3389 Epoch 22/200 301/301 [==============================] - 0s 124us/sample - loss: 17.6358 - mae: 3.1887 - val_loss: 20.0041 - val_mae: 3.3581 Epoch 23/200 301/301 [==============================] - 0s 128us/sample - loss: 16.5520 - mae: 2.9965 - val_loss: 18.0351 - val_mae: 3.1682 Epoch 24/200 301/301 [==============================] - 0s 122us/sample - loss: 16.3021 - mae: 3.0126 - val_loss: 18.1887 - val_mae: 3.1814 Epoch 25/200 301/301 [==============================] - 0s 121us/sample - loss: 16.0756 - mae: 2.9773 - val_loss: 17.5533 - val_mae: 3.1037 Epoch 26/200 301/301 [==============================] - 0s 123us/sample - loss: 16.0928 - mae: 2.9663 - val_loss: 17.9634 - val_mae: 3.1289 Epoch 27/200 301/301 [==============================] - 0s 125us/sample - loss: 16.2698 - mae: 3.0157 - val_loss: 17.5642 - val_mae: 3.1063 Epoch 28/200 301/301 [==============================] - 0s 123us/sample - loss: 15.4724 - mae: 2.9506 - val_loss: 17.3212 - val_mae: 3.0923 Epoch 29/200 301/301 [==============================] - 0s 121us/sample - loss: 15.3417 - mae: 2.9225 - val_loss: 17.3329 - val_mae: 3.0926 Epoch 30/200 301/301 [==============================] - 0s 129us/sample - loss: 15.3590 - mae: 2.9085 - val_loss: 17.1692 - val_mae: 3.0538 Epoch 31/200 301/301 [==============================] - 0s 121us/sample - loss: 15.3008 - mae: 2.9508 - val_loss: 16.9827 - val_mae: 3.0485 Epoch 32/200 301/301 [==============================] - 0s 120us/sample - loss: 15.3288 - mae: 2.9086 - val_loss: 16.6342 - val_mae: 3.0014 Epoch 33/200 301/301 [==============================] - 0s 121us/sample - loss: 14.7032 - mae: 2.8678 - val_loss: 16.8193 - val_mae: 
3.0173 Epoch 34/200 301/301 [==============================] - 0s 127us/sample - loss: 14.5504 - mae: 2.8291 - val_loss: 17.2827 - val_mae: 3.0797 Epoch 35/200 301/301 [==============================] - 0s 125us/sample - loss: 14.6160 - mae: 2.8729 - val_loss: 16.8913 - val_mae: 3.0378 Epoch 36/200 301/301 [==============================] - 0s 123us/sample - loss: 14.4692 - mae: 2.8614 - val_loss: 16.5014 - val_mae: 2.9947 Epoch 37/200 301/301 [==============================] - 0s 124us/sample - loss: 14.2894 - mae: 2.8242 - val_loss: 16.1010 - val_mae: 2.9579 Epoch 38/200 301/301 [==============================] - 0s 125us/sample - loss: 14.4016 - mae: 2.8782 - val_loss: 16.3284 - val_mae: 2.9936 Epoch 39/200 301/301 [==============================] - 0s 125us/sample - loss: 14.1641 - mae: 2.8471 - val_loss: 16.4528 - val_mae: 2.9934 Epoch 40/200 301/301 [==============================] - 0s 121us/sample - loss: 14.0289 - mae: 2.7992 - val_loss: 16.0553 - val_mae: 2.9505 Epoch 41/200 301/301 [==============================] - 0s 125us/sample - loss: 13.9184 - mae: 2.7897 - val_loss: 15.8337 - val_mae: 2.9169 Epoch 42/200 301/301 [==============================] - 0s 126us/sample - loss: 13.6890 - mae: 2.7649 - val_loss: 15.5991 - val_mae: 2.9159 Epoch 43/200 301/301 [==============================] - 0s 121us/sample - loss: 13.9029 - mae: 2.7996 - val_loss: 15.6777 - val_mae: 2.9094 Epoch 44/200 301/301 [==============================] - 0s 123us/sample - loss: 13.6696 - mae: 2.7756 - val_loss: 15.5494 - val_mae: 2.9074 Epoch 45/200 301/301 [==============================] - 0s 124us/sample - loss: 13.5342 - mae: 2.7741 - val_loss: 15.7018 - val_mae: 2.9251 Epoch 46/200 301/301 [==============================] - 0s 126us/sample - loss: 13.6622 - mae: 2.7782 - val_loss: 16.0312 - val_mae: 2.9620 Epoch 47/200 301/301 [==============================] - 0s 122us/sample - loss: 13.4223 - mae: 2.7262 - val_loss: 15.5092 - val_mae: 2.8869 Epoch 48/200 301/301 
[==============================] - 0s 127us/sample - loss: 13.3060 - mae: 2.7253 - val_loss: 15.5132 - val_mae: 2.8769 Epoch 49/200 301/301 [==============================] - 0s 125us/sample - loss: 13.4176 - mae: 2.7764 - val_loss: 15.5348 - val_mae: 2.8907 Epoch 50/200 301/301 [==============================] - 0s 123us/sample - loss: 13.2100 - mae: 2.7630 - val_loss: 15.5680 - val_mae: 2.9164 Epoch 51/200 301/301 [==============================] - 0s 124us/sample - loss: 13.6302 - mae: 2.7931 - val_loss: 15.7178 - val_mae: 2.8834 Epoch 52/200 301/301 [==============================] - 0s 125us/sample - loss: 13.6975 - mae: 2.7907 - val_loss: 15.4423 - val_mae: 2.8799 Epoch 53/200 301/301 [==============================] - 0s 125us/sample - loss: 13.0397 - mae: 2.7109 - val_loss: 15.3080 - val_mae: 2.8572 Epoch 54/200 301/301 [==============================] - 0s 123us/sample - loss: 13.0979 - mae: 2.7281 - val_loss: 15.3058 - val_mae: 2.8813 Epoch 55/200 301/301 [==============================] - 0s 122us/sample - loss: 13.0042 - mae: 2.7239 - val_loss: 15.2463 - val_mae: 2.9135 Epoch 56/200 301/301 [==============================] - 0s 130us/sample - loss: 12.9081 - mae: 2.7020 - val_loss: 15.0441 - val_mae: 2.8349 Epoch 57/200 301/301 [==============================] - 0s 126us/sample - loss: 12.9731 - mae: 2.7309 - val_loss: 15.2925 - val_mae: 2.8635 Epoch 58/200 301/301 [==============================] - 0s 123us/sample - loss: 12.8796 - mae: 2.7103 - val_loss: 15.6531 - val_mae: 2.8777 Epoch 59/200 301/301 [==============================] - 0s 124us/sample - loss: 12.8523 - mae: 2.7127 - val_loss: 15.3808 - val_mae: 2.8647 Epoch 60/200 301/301 [==============================] - 0s 124us/sample - loss: 12.7105 - mae: 2.7005 - val_loss: 15.7587 - val_mae: 2.9765 Epoch 61/200 301/301 [==============================] - 0s 121us/sample - loss: 12.8449 - mae: 2.7382 - val_loss: 15.9367 - val_mae: 2.8921 Epoch 62/200 301/301 [==============================] - 0s 
127us/sample - loss: 13.0402 - mae: 2.7317 - val_loss: 16.3820 - val_mae: 2.9362 Epoch 63/200 301/301 [==============================] - 0s 127us/sample - loss: 12.9783 - mae: 2.7475 - val_loss: 15.2263 - val_mae: 2.8277 Epoch 64/200 301/301 [==============================] - 0s 126us/sample - loss: 12.6177 - mae: 2.6846 - val_loss: 14.9181 - val_mae: 2.8200 Epoch 65/200 301/301 [==============================] - 0s 126us/sample - loss: 12.4389 - mae: 2.6758 - val_loss: 14.9955 - val_mae: 2.8444 Epoch 66/200 301/301 [==============================] - 0s 124us/sample - loss: 12.5869 - mae: 2.6952 - val_loss: 15.3776 - val_mae: 2.9260 Epoch 67/200 301/301 [==============================] - 0s 127us/sample - loss: 13.1804 - mae: 2.7510 - val_loss: 14.9607 - val_mae: 2.8535 Epoch 68/200 301/301 [==============================] - 0s 126us/sample - loss: 12.6151 - mae: 2.6696 - val_loss: 14.7938 - val_mae: 2.8193 Epoch 69/200 301/301 [==============================] - 0s 125us/sample - loss: 12.4815 - mae: 2.6702 - val_loss: 15.1933 - val_mae: 2.8294 Epoch 70/200 301/301 [==============================] - 0s 127us/sample - loss: 12.5620 - mae: 2.6465 - val_loss: 16.3481 - val_mae: 3.1338 Epoch 71/200 301/301 [==============================] - 0s 126us/sample - loss: 12.6492 - mae: 2.6646 - val_loss: 14.7940 - val_mae: 2.7945 Epoch 72/200 301/301 [==============================] - 0s 127us/sample - loss: 12.2102 - mae: 2.6209 - val_loss: 14.8311 - val_mae: 2.8597 Epoch 73/200 301/301 [==============================] - 0s 130us/sample - loss: 12.2318 - mae: 2.6297 - val_loss: 14.9179 - val_mae: 2.8882 Epoch 74/200 301/301 [==============================] - 0s 122us/sample - loss: 12.1703 - mae: 2.6429 - val_loss: 14.7952 - val_mae: 2.7910 Epoch 75/200 301/301 [==============================] - 0s 121us/sample - loss: 12.1712 - mae: 2.6242 - val_loss: 14.6987 - val_mae: 2.7972 Epoch 76/200 301/301 [==============================] - 0s 122us/sample - loss: 12.2342 - mae: 
2.6395 - val_loss: 14.7024 - val_mae: 2.8111 Epoch 77/200 301/301 [==============================] - 0s 123us/sample - loss: 12.1362 - mae: 2.6178 - val_loss: 14.6714 - val_mae: 2.7844 Epoch 78/200 301/301 [==============================] - 0s 122us/sample - loss: 11.9614 - mae: 2.5838 - val_loss: 15.3918 - val_mae: 3.0371 Epoch 79/200 301/301 [==============================] - 0s 121us/sample - loss: 12.5722 - mae: 2.6644 - val_loss: 14.7159 - val_mae: 2.8378 Epoch 80/200 301/301 [==============================] - 0s 123us/sample - loss: 12.0639 - mae: 2.6359 - val_loss: 15.3039 - val_mae: 2.8286 Epoch 81/200 301/301 [==============================] - 0s 126us/sample - loss: 12.6998 - mae: 2.7184 - val_loss: 14.6700 - val_mae: 2.7831 Epoch 82/200 301/301 [==============================] - 0s 130us/sample - loss: 12.2679 - mae: 2.6559 - val_loss: 15.1330 - val_mae: 2.9718 Epoch 83/200 301/301 [==============================] - 0s 127us/sample - loss: 12.0866 - mae: 2.6309 - val_loss: 14.8407 - val_mae: 2.7693 Epoch 84/200 301/301 [==============================] - 0s 132us/sample - loss: 11.8089 - mae: 2.5896 - val_loss: 14.6265 - val_mae: 2.8633 Epoch 85/200 301/301 [==============================] - 0s 121us/sample - loss: 12.0862 - mae: 2.6293 - val_loss: 14.8167 - val_mae: 2.8471 Epoch 86/200 301/301 [==============================] - 0s 128us/sample - loss: 11.8023 - mae: 2.5500 - val_loss: 14.5620 - val_mae: 2.8209 Epoch 87/200 301/301 [==============================] - 0s 129us/sample - loss: 11.9156 - mae: 2.6155 - val_loss: 14.7479 - val_mae: 2.9354 Epoch 88/200 301/301 [==============================] - 0s 126us/sample - loss: 11.8480 - mae: 2.5941 - val_loss: 14.4978 - val_mae: 2.7933 Epoch 89/200 301/301 [==============================] - 0s 126us/sample - loss: 11.9734 - mae: 2.6338 - val_loss: 16.0863 - val_mae: 2.9231 Epoch 90/200 301/301 [==============================] - 0s 126us/sample - loss: 11.8548 - mae: 2.6098 - val_loss: 14.5206 - val_mae: 
2.7715 Epoch 91/200 301/301 [==============================] - 0s 124us/sample - loss: 11.8521 - mae: 2.5836 - val_loss: 14.7502 - val_mae: 2.9304 Epoch 92/200 301/301 [==============================] - 0s 125us/sample - loss: 11.9799 - mae: 2.5687 - val_loss: 14.4606 - val_mae: 2.7714 Epoch 93/200 301/301 [==============================] - 0s 125us/sample - loss: 11.7796 - mae: 2.5912 - val_loss: 14.5859 - val_mae: 2.7691 Epoch 94/200 301/301 [==============================] - 0s 125us/sample - loss: 11.6728 - mae: 2.5615 - val_loss: 14.2912 - val_mae: 2.7742 Epoch 95/200 301/301 [==============================] - 0s 124us/sample - loss: 12.0226 - mae: 2.6269 - val_loss: 14.9655 - val_mae: 2.7869 Epoch 96/200 301/301 [==============================] - 0s 124us/sample - loss: 11.6121 - mae: 2.5627 - val_loss: 14.4440 - val_mae: 2.7657 Epoch 97/200 301/301 [==============================] - 0s 124us/sample - loss: 11.9196 - mae: 2.6055 - val_loss: 14.7297 - val_mae: 2.7922 Epoch 98/200 301/301 [==============================] - 0s 125us/sample - loss: 11.4272 - mae: 2.5038 - val_loss: 14.2012 - val_mae: 2.8051 Epoch 99/200 301/301 [==============================] - 0s 122us/sample - loss: 11.4438 - mae: 2.5273 - val_loss: 14.9132 - val_mae: 2.7415 Epoch 100/200 301/301 [==============================] - 0s 129us/sample - loss: 11.7952 - mae: 2.5831 - val_loss: 14.5069 - val_mae: 2.7239 Epoch 101/200 301/301 [==============================] - 0s 128us/sample - loss: 11.6462 - mae: 2.5674 - val_loss: 14.5896 - val_mae: 2.7600 Epoch 102/200 301/301 [==============================] - 0s 123us/sample - loss: 11.5965 - mae: 2.5167 - val_loss: 14.7579 - val_mae: 2.7818 Epoch 103/200 301/301 [==============================] - 0s 123us/sample - loss: 11.7869 - mae: 2.5913 - val_loss: 14.8849 - val_mae: 2.7992 Epoch 104/200 301/301 [==============================] - 0s 132us/sample - loss: 11.5269 - mae: 2.5476 - val_loss: 14.3206 - val_mae: 2.7703 Epoch 105/200 301/301 
[==============================] - 0s 125us/sample - loss: 11.6149 - mae: 2.5572 - val_loss: 14.0127 - val_mae: 2.7370
Epoch 106/200 ... Epoch 157/200: per-epoch lines condensed; training loss decreased from 11.56 to 10.21 and val_mae settled around 2.64-2.75
101/1 [==============================] - 0s 40us/sample - loss: 9.1205 - mae: 2.5882
[CV] END learning_rate=0.002433174108561598, n_hidden=1, n_neurons=34; total time= 6.6s
Train on 302 samples, validate on 135 samples
Epoch 1/200 ... Epoch 127/200: per-epoch lines condensed; loss fell from 535.79 to 11.08, val_loss from 462.54 to 13.92, val_mae from 20.47 to 2.71
100/1 [==============================] - 0s 41us/sample - loss: 10.9513 - mae: 2.6871
[CV] END learning_rate=0.002433174108561598, n_hidden=1, n_neurons=34; total time= 5.4s
Train on 302 samples, validate on 135 samples
Epoch 1/200 ... Epoch 63/200: per-epoch lines condensed; loss fell from 544.35 to roughly 11, val_loss from 481.48 to 16.28, val_mae from 20.94 to about 3.0
302/302 [==============================] - 0s 123us/sample - loss: 11.0619 - mae: 
2.4909 - val_loss: 16.1617 - val_mae: 2.9739 Epoch 64/200 302/302 [==============================] - 0s 123us/sample - loss: 11.2689 - mae: 2.4657 - val_loss: 16.3112 - val_mae: 2.9630 Epoch 65/200 302/302 [==============================] - 0s 120us/sample - loss: 10.9131 - mae: 2.4794 - val_loss: 16.0604 - val_mae: 2.9490 Epoch 66/200 302/302 [==============================] - 0s 121us/sample - loss: 10.8747 - mae: 2.4349 - val_loss: 16.0803 - val_mae: 2.9457 Epoch 67/200 302/302 [==============================] - 0s 121us/sample - loss: 10.8065 - mae: 2.4336 - val_loss: 16.2826 - val_mae: 2.9935 Epoch 68/200 302/302 [==============================] - 0s 123us/sample - loss: 10.6718 - mae: 2.3864 - val_loss: 16.3847 - val_mae: 3.0051 Epoch 69/200 302/302 [==============================] - 0s 120us/sample - loss: 10.6660 - mae: 2.4307 - val_loss: 15.9764 - val_mae: 2.9437 Epoch 70/200 302/302 [==============================] - 0s 120us/sample - loss: 10.7266 - mae: 2.4348 - val_loss: 16.2850 - val_mae: 2.9749 Epoch 71/200 302/302 [==============================] - 0s 122us/sample - loss: 10.6713 - mae: 2.4002 - val_loss: 15.9858 - val_mae: 2.9484 Epoch 72/200 302/302 [==============================] - 0s 122us/sample - loss: 11.1004 - mae: 2.4807 - val_loss: 15.9375 - val_mae: 2.9731 Epoch 73/200 302/302 [==============================] - 0s 124us/sample - loss: 10.6319 - mae: 2.4239 - val_loss: 15.8340 - val_mae: 2.9482 Epoch 74/200 302/302 [==============================] - 0s 126us/sample - loss: 10.7770 - mae: 2.4331 - val_loss: 15.7582 - val_mae: 2.9249 Epoch 75/200 302/302 [==============================] - 0s 126us/sample - loss: 10.5937 - mae: 2.4359 - val_loss: 16.4575 - val_mae: 3.0099 Epoch 76/200 302/302 [==============================] - 0s 129us/sample - loss: 10.6843 - mae: 2.4492 - val_loss: 16.5317 - val_mae: 3.0112 Epoch 77/200 302/302 [==============================] - 0s 124us/sample - loss: 10.6085 - mae: 2.3891 - val_loss: 15.9256 - val_mae: 
2.9570 Epoch 78/200 302/302 [==============================] - 0s 129us/sample - loss: 10.6161 - mae: 2.4230 - val_loss: 15.9005 - val_mae: 2.9714 Epoch 79/200 302/302 [==============================] - 0s 128us/sample - loss: 10.7479 - mae: 2.4176 - val_loss: 15.7037 - val_mae: 2.9226 Epoch 80/200 302/302 [==============================] - 0s 123us/sample - loss: 10.5438 - mae: 2.3941 - val_loss: 16.2691 - val_mae: 2.9676 Epoch 81/200 302/302 [==============================] - 0s 129us/sample - loss: 10.7274 - mae: 2.4551 - val_loss: 15.8808 - val_mae: 2.9698 Epoch 82/200 302/302 [==============================] - 0s 122us/sample - loss: 10.4170 - mae: 2.4024 - val_loss: 15.8299 - val_mae: 2.9459 Epoch 83/200 302/302 [==============================] - 0s 123us/sample - loss: 10.3732 - mae: 2.3974 - val_loss: 15.4895 - val_mae: 2.9070 Epoch 84/200 302/302 [==============================] - 0s 125us/sample - loss: 10.3145 - mae: 2.3823 - val_loss: 16.6521 - val_mae: 3.0052 Epoch 85/200 302/302 [==============================] - 0s 121us/sample - loss: 10.3932 - mae: 2.3536 - val_loss: 15.7695 - val_mae: 2.9727 Epoch 86/200 302/302 [==============================] - 0s 125us/sample - loss: 10.7648 - mae: 2.4260 - val_loss: 15.5266 - val_mae: 2.9038 Epoch 87/200 302/302 [==============================] - 0s 123us/sample - loss: 10.3898 - mae: 2.3781 - val_loss: 15.5457 - val_mae: 2.9247 Epoch 88/200 302/302 [==============================] - 0s 124us/sample - loss: 10.5095 - mae: 2.4464 - val_loss: 15.7404 - val_mae: 2.9520 Epoch 89/200 302/302 [==============================] - 0s 122us/sample - loss: 10.3542 - mae: 2.3875 - val_loss: 16.2083 - val_mae: 3.0026 Epoch 90/200 302/302 [==============================] - 0s 126us/sample - loss: 10.9112 - mae: 2.4436 - val_loss: 15.4003 - val_mae: 2.8886 Epoch 91/200 302/302 [==============================] - 0s 123us/sample - loss: 10.3428 - mae: 2.3643 - val_loss: 15.6858 - val_mae: 2.9135 Epoch 92/200 302/302 
[==============================] - 0s 128us/sample - loss: 10.3089 - mae: 2.3525 - val_loss: 15.4928 - val_mae: 2.9079 Epoch 93/200 302/302 [==============================] - 0s 122us/sample - loss: 10.0147 - mae: 2.3226 - val_loss: 15.7602 - val_mae: 2.9264 Epoch 94/200 302/302 [==============================] - 0s 124us/sample - loss: 10.2614 - mae: 2.3580 - val_loss: 15.6577 - val_mae: 2.9165 Epoch 95/200 302/302 [==============================] - 0s 125us/sample - loss: 10.0594 - mae: 2.3772 - val_loss: 16.4337 - val_mae: 2.9746 Epoch 96/200 302/302 [==============================] - 0s 122us/sample - loss: 10.4377 - mae: 2.3807 - val_loss: 15.5093 - val_mae: 2.9163 Epoch 97/200 302/302 [==============================] - 0s 125us/sample - loss: 9.9706 - mae: 2.3136 - val_loss: 15.4976 - val_mae: 2.9046 Epoch 98/200 302/302 [==============================] - 0s 119us/sample - loss: 10.0867 - mae: 2.3225 - val_loss: 15.4863 - val_mae: 2.9283 Epoch 99/200 302/302 [==============================] - 0s 123us/sample - loss: 10.2884 - mae: 2.3670 - val_loss: 15.5184 - val_mae: 2.9249 Epoch 100/200 302/302 [==============================] - 0s 124us/sample - loss: 9.9018 - mae: 2.3045 - val_loss: 15.5922 - val_mae: 2.9273 100/1 
[==============================] - 0s 39us/sample - loss: 11.6860 - mae: 2.9253
[CV] END learning_rate=0.002433174108561598, n_hidden=1, n_neurons=34; total time= 4.3s
Train on 301 samples, validate on 135 samples
Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 441.5955 - mae: 19.4913 - val_loss: 145.9946 - val_mae: 10.9256
[... epochs 2-27 elided ...]
Epoch 28/200 301/301 [==============================] - 0s 137us/sample - loss: 13.9601 - mae: 2.8115 - val_loss: 17.1228 - val_mae: 2.9720
101/1
[==============================] - 0s 44us/sample - loss: 17.1831 - mae: 2.6091
[CV] END learning_rate=0.008782805365782789, n_hidden=1, n_neurons=85; total time= 1.8s
Train on 301 samples, validate on 135 samples
Epoch 1/200 301/301 [==============================] - 0s 2ms/sample - loss: 415.6578 - mae: 18.8860 - val_loss: 114.3110 - val_mae: 9.5160
[... epochs 2-58 elided ...]
Epoch 59/200 301/301 [==============================] - 0s 133us/sample - loss: 9.8723 - mae: 2.2625 - val_loss: 20.1615 - val_mae: 3.7630
101/1
[==============================] - 0s 44us/sample - loss: 17.0915 - mae: 4.0219
[CV] END learning_rate=0.008782805365782789, n_hidden=1, n_neurons=85; total time= 3.5s
Train on 302 samples, validate on 135 samples
Epoch 1/200 302/302 [==============================] - 0s 2ms/sample - loss: 398.1240 - mae: 18.5357 - val_loss: 102.2401 - val_mae: 8.6761
[... epochs 2-46 elided ...]
Epoch 47/200 302/302 [==============================] - 0s 143us/sample - loss: 12.1658 - mae: 2.6470 - val_loss: 14.9529 - val_mae: 2.8740
100/1
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 45us/sample - loss: 9.8399 - mae: 2.7222 [CV] END learning_rate=0.008782805365782789, n_hidden=1, n_neurons=85; total time= 2.5s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 2ms/sample - loss: 374.2941 - mae: 17.8518 - val_loss: 108.4459 - val_mae: 8.7679 Epoch 2/200 302/302 [==============================] - 0s 132us/sample - loss: 42.9387 - mae: 5.1755 - val_loss: 29.2367 - val_mae: 4.2600 Epoch 3/200 302/302 [==============================] - 0s 136us/sample - loss: 23.2552 - mae: 3.7100 - val_loss: 24.1182 - val_mae: 3.8157 Epoch 4/200 302/302 [==============================] - 0s 133us/sample - loss: 19.5335 - mae: 3.3804 - val_loss: 23.9112 - val_mae: 3.6102 Epoch 5/200 302/302 [==============================] - 0s 132us/sample - loss: 17.7793 - mae: 3.1748 - val_loss: 23.9003 - val_mae: 3.6765 Epoch 6/200 302/302 [==============================] - 0s 129us/sample - loss: 18.0197 - mae: 3.2086 - val_loss: 22.9094 
- val_mae: 3.6481 Epoch 7/200 302/302 [==============================] - 0s 136us/sample - loss: 16.6199 - mae: 3.0210 - val_loss: 29.6793 - val_mae: 4.0226 Epoch 8/200 302/302 [==============================] - 0s 141us/sample - loss: 18.5403 - mae: 3.2900 - val_loss: 25.2502 - val_mae: 3.8427 Epoch 9/200 302/302 [==============================] - 0s 131us/sample - loss: 17.7769 - mae: 3.2534 - val_loss: 22.0209 - val_mae: 3.5201 Epoch 10/200 302/302 [==============================] - 0s 132us/sample - loss: 15.7443 - mae: 2.9973 - val_loss: 21.2117 - val_mae: 3.4887 Epoch 11/200 302/302 [==============================] - 0s 134us/sample - loss: 16.2066 - mae: 3.1010 - val_loss: 22.9131 - val_mae: 3.8407 Epoch 12/200 302/302 [==============================] - 0s 137us/sample - loss: 15.8056 - mae: 3.0543 - val_loss: 19.7800 - val_mae: 3.3382 Epoch 13/200 302/302 [==============================] - 0s 132us/sample - loss: 13.7060 - mae: 2.7958 - val_loss: 19.1990 - val_mae: 3.2356 Epoch 14/200 302/302 [==============================] - 0s 134us/sample - loss: 14.5167 - mae: 2.8174 - val_loss: 19.5523 - val_mae: 3.2770 Epoch 15/200 302/302 [==============================] - 0s 128us/sample - loss: 14.5014 - mae: 2.8447 - val_loss: 23.1698 - val_mae: 3.7585 Epoch 16/200 302/302 [==============================] - 0s 128us/sample - loss: 14.7979 - mae: 2.9758 - val_loss: 20.2854 - val_mae: 3.2302 Epoch 17/200 302/302 [==============================] - 0s 133us/sample - loss: 13.3968 - mae: 2.7574 - val_loss: 20.7655 - val_mae: 3.6005 Epoch 18/200 302/302 [==============================] - 0s 127us/sample - loss: 13.6675 - mae: 2.7789 - val_loss: 18.5549 - val_mae: 3.2970 Epoch 19/200 302/302 [==============================] - 0s 136us/sample - loss: 15.1197 - mae: 2.9246 - val_loss: 17.7293 - val_mae: 3.0206 Epoch 20/200 302/302 [==============================] - 0s 130us/sample - loss: 13.6465 - mae: 2.7763 - val_loss: 19.8763 - val_mae: 3.2052 Epoch 21/200 302/302 
[==============================] - 0s 133us/sample - loss: 12.5249 - mae: 2.6746 - val_loss: 18.2296 - val_mae: 3.1903 Epoch 22/200 302/302 [==============================] - 0s 130us/sample - loss: 12.2458 - mae: 2.6938 - val_loss: 19.8642 - val_mae: 3.1843 Epoch 23/200 302/302 [==============================] - 0s 128us/sample - loss: 12.3152 - mae: 2.6509 - val_loss: 16.7277 - val_mae: 2.9529 Epoch 24/200 302/302 [==============================] - 0s 137us/sample - loss: 13.1220 - mae: 2.8249 - val_loss: 17.4598 - val_mae: 3.1232 Epoch 25/200 302/302 [==============================] - 0s 129us/sample - loss: 12.9706 - mae: 2.7220 - val_loss: 20.1719 - val_mae: 3.5356 Epoch 26/200 302/302 [==============================] - 0s 133us/sample - loss: 12.4130 - mae: 2.6977 - val_loss: 20.6423 - val_mae: 3.5865 Epoch 27/200 302/302 [==============================] - 0s 132us/sample - loss: 12.0164 - mae: 2.5679 - val_loss: 21.5256 - val_mae: 3.2837 Epoch 28/200 302/302 [==============================] - 0s 130us/sample - loss: 12.5124 - mae: 2.6215 - val_loss: 15.9668 - val_mae: 2.9887 Epoch 29/200 302/302 [==============================] - 0s 134us/sample - loss: 10.8283 - mae: 2.4114 - val_loss: 17.7855 - val_mae: 3.1110 Epoch 30/200 302/302 [==============================] - 0s 134us/sample - loss: 11.9582 - mae: 2.5556 - val_loss: 18.6177 - val_mae: 3.2857 Epoch 31/200 302/302 [==============================] - 0s 134us/sample - loss: 11.1352 - mae: 2.5049 - val_loss: 17.2921 - val_mae: 3.0116 Epoch 32/200 302/302 [==============================] - 0s 129us/sample - loss: 12.7039 - mae: 2.7275 - val_loss: 17.2822 - val_mae: 3.1674 Epoch 33/200 302/302 [==============================] - 0s 130us/sample - loss: 11.2241 - mae: 2.5135 - val_loss: 20.1723 - val_mae: 3.2125 Epoch 34/200 302/302 [==============================] - 0s 130us/sample - loss: 10.4389 - mae: 2.3905 - val_loss: 16.5159 - val_mae: 3.0201 Epoch 35/200 302/302 [==============================] - 0s 
129us/sample - loss: 11.3573 - mae: 2.4721 - val_loss: 18.1971 - val_mae: 3.0817 Epoch 36/200 302/302 [==============================] - 0s 133us/sample - loss: 12.2611 - mae: 2.5918 - val_loss: 16.8918 - val_mae: 2.9660 Epoch 37/200 302/302 [==============================] - 0s 133us/sample - loss: 11.0148 - mae: 2.4725 - val_loss: 19.6820 - val_mae: 3.2230 Epoch 38/200 302/302 [==============================] - 0s 131us/sample - loss: 11.5290 - mae: 2.5331 - val_loss: 15.7601 - val_mae: 2.8295 Epoch 39/200 302/302 [==============================] - 0s 130us/sample - loss: 10.2677 - mae: 2.4053 - val_loss: 17.6045 - val_mae: 3.2575 Epoch 40/200 302/302 [==============================] - 0s 135us/sample - loss: 11.3188 - mae: 2.5060 - val_loss: 17.2700 - val_mae: 2.9747 Epoch 41/200 302/302 [==============================] - 0s 133us/sample - loss: 11.4699 - mae: 2.5737 - val_loss: 16.4227 - val_mae: 2.9842 Epoch 42/200 302/302 [==============================] - 0s 132us/sample - loss: 10.8168 - mae: 2.4215 - val_loss: 17.7775 - val_mae: 3.1182 Epoch 43/200 302/302 [==============================] - 0s 133us/sample - loss: 10.5772 - mae: 2.4463 - val_loss: 16.3629 - val_mae: 2.9638 Epoch 44/200 302/302 [==============================] - 0s 131us/sample - loss: 10.5355 - mae: 2.3874 - val_loss: 17.2317 - val_mae: 3.1370 Epoch 45/200 302/302 [==============================] - 0s 130us/sample - loss: 10.1975 - mae: 2.3851 - val_loss: 17.1621 - val_mae: 3.1010 Epoch 46/200 302/302 [==============================] - 0s 128us/sample - loss: 9.8049 - mae: 2.3080 - val_loss: 17.2698 - val_mae: 3.0089 Epoch 47/200 302/302 [==============================] - 0s 130us/sample - loss: 10.6296 - mae: 2.3567 - val_loss: 19.7804 - val_mae: 3.1632 Epoch 48/200 302/302 [==============================] - 0s 137us/sample - loss: 11.3102 - mae: 2.5610 - val_loss: 18.2868 - val_mae: 3.0220 100/1 
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 44us/sample - loss: 12.7636 - mae: 2.8892 [CV] END learning_rate=0.008782805365782789, n_hidden=1, n_neurons=85; total time= 2.6s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 0s 2ms/sample - loss: 605.0693 - mae: 22.3140 - val_loss: 545.0148 - val_mae: 21.4531 Epoch 2/200 301/301 [==============================] - 0s 125us/sample - loss: 550.0312 - mae: 21.6643 - val_loss: 498.6935 - val_mae: 20.8467 Epoch 3/200 301/301 [==============================] - 0s 121us/sample - loss: 503.6905 - mae: 21.0115 - val_loss: 454.9574 - val_mae: 20.1443 Epoch 4/200 301/301 [==============================] - 0s 120us/sample - loss: 459.0730 - mae: 20.1908 - val_loss: 413.5763 - val_mae: 19.3005 Epoch 5/200 301/301 [==============================] - 0s 118us/sample - loss: 413.3976 - mae: 19.2378 - val_loss: 369.8279 - val_mae: 18.2808 Epoch 6/200 301/301 [==============================] - 0s 119us/sample - loss: 367.3443 - mae: 18.1087 
- val_loss: 324.9004 - val_mae: 17.0877 Epoch 7/200 301/301 [==============================] - 0s 123us/sample - loss: 320.8689 - mae: 16.8415 - val_loss: 279.7440 - val_mae: 15.7390 Epoch 8/200 301/301 [==============================] - 0s 123us/sample - loss: 273.7309 - mae: 15.4121 - val_loss: 234.3580 - val_mae: 14.2650 Epoch 9/200 301/301 [==============================] - 0s 120us/sample - loss: 227.4153 - mae: 13.8534 - val_loss: 191.4609 - val_mae: 12.6845 Epoch 10/200 301/301 [==============================] - 0s 118us/sample - loss: 185.4468 - mae: 12.2129 - val_loss: 153.7243 - val_mae: 11.1754 Epoch 11/200 301/301 [==============================] - 0s 116us/sample - loss: 148.1182 - mae: 10.6856 - val_loss: 120.2578 - val_mae: 9.7261 Epoch 12/200 301/301 [==============================] - 0s 117us/sample - loss: 115.2665 - mae: 9.2206 - val_loss: 91.6041 - val_mae: 8.3338 Epoch 13/200 301/301 [==============================] - 0s 123us/sample - loss: 87.7807 - mae: 7.8474 - val_loss: 68.2592 - val_mae: 7.0356 Epoch 14/200 301/301 [==============================] - 0s 115us/sample - loss: 65.3039 - mae: 6.5854 - val_loss: 49.8932 - val_mae: 5.7790 Epoch 15/200 301/301 [==============================] - 0s 117us/sample - loss: 48.6563 - mae: 5.5402 - val_loss: 37.0543 - val_mae: 4.7582 Epoch 16/200 301/301 [==============================] - 0s 116us/sample - loss: 36.8898 - mae: 4.6184 - val_loss: 29.3985 - val_mae: 4.0609 Epoch 17/200 301/301 [==============================] - 0s 116us/sample - loss: 29.4964 - mae: 4.1434 - val_loss: 25.3763 - val_mae: 3.6702 Epoch 18/200 301/301 [==============================] - 0s 116us/sample - loss: 26.0492 - mae: 3.9065 - val_loss: 24.2061 - val_mae: 3.6235 Epoch 19/200 301/301 [==============================] - 0s 116us/sample - loss: 24.1884 - mae: 3.7370 - val_loss: 23.4140 - val_mae: 3.6076 Epoch 20/200 301/301 [==============================] - 0s 114us/sample - loss: 23.3003 - mae: 3.7065 - val_loss: 22.8992 
- val_mae: 3.6137 Epoch 21/200 301/301 [==============================] - 0s 118us/sample - loss: 22.6083 - mae: 3.6612 - val_loss: 22.9555 - val_mae: 3.6454 Epoch 22/200 301/301 [==============================] - 0s 119us/sample - loss: 21.9247 - mae: 3.5835 - val_loss: 22.4637 - val_mae: 3.6102 Epoch 23/200 301/301 [==============================] - 0s 118us/sample - loss: 21.4339 - mae: 3.5522 - val_loss: 22.1210 - val_mae: 3.5969 Epoch 24/200 301/301 [==============================] - 0s 119us/sample - loss: 20.8853 - mae: 3.5087 - val_loss: 22.0102 - val_mae: 3.5977 Epoch 25/200 301/301 [==============================] - 0s 118us/sample - loss: 20.5916 - mae: 3.4826 - val_loss: 21.9890 - val_mae: 3.6049 Epoch 26/200 301/301 [==============================] - 0s 120us/sample - loss: 20.3706 - mae: 3.4456 - val_loss: 21.5613 - val_mae: 3.5544 Epoch 27/200 301/301 [==============================] - 0s 122us/sample - loss: 19.9572 - mae: 3.4491 - val_loss: 21.8955 - val_mae: 3.6235 Epoch 28/200 301/301 [==============================] - 0s 121us/sample - loss: 19.6385 - mae: 3.3834 - val_loss: 21.2808 - val_mae: 3.5292 Epoch 29/200 301/301 [==============================] - 0s 121us/sample - loss: 19.9371 - mae: 3.3817 - val_loss: 21.1116 - val_mae: 3.5140 Epoch 30/200 301/301 [==============================] - 0s 119us/sample - loss: 19.4837 - mae: 3.3815 - val_loss: 21.2553 - val_mae: 3.5573 Epoch 31/200 301/301 [==============================] - 0s 126us/sample - loss: 18.9390 - mae: 3.3193 - val_loss: 20.8502 - val_mae: 3.4913 Epoch 32/200 301/301 [==============================] - 0s 121us/sample - loss: 18.8020 - mae: 3.2950 - val_loss: 20.9236 - val_mae: 3.5106 Epoch 33/200 301/301 [==============================] - 0s 121us/sample - loss: 18.5310 - mae: 3.2611 - val_loss: 20.7946 - val_mae: 3.4936 Epoch 34/200 301/301 [==============================] - 0s 119us/sample - loss: 18.5739 - mae: 3.2821 - val_loss: 20.6988 - val_mae: 3.4800 Epoch 35/200 301/301 
[==============================] - 0s 120us/sample - loss: 18.4333 - mae: 3.2549 - val_loss: 20.5666 - val_mae: 3.4679 Epoch 36/200 301/301 [==============================] - 0s 122us/sample - loss: 18.2168 - mae: 3.2146 - val_loss: 20.4105 - val_mae: 3.4549 Epoch 37/200 301/301 [==============================] - 0s 122us/sample - loss: 17.9710 - mae: 3.1992 - val_loss: 20.3434 - val_mae: 3.4564 Epoch 38/200 301/301 [==============================] - 0s 120us/sample - loss: 17.7750 - mae: 3.1834 - val_loss: 20.3207 - val_mae: 3.4730 Epoch 39/200 301/301 [==============================] - 0s 122us/sample - loss: 17.7745 - mae: 3.1808 - val_loss: 20.2549 - val_mae: 3.4399 Epoch 40/200 301/301 [==============================] - 0s 124us/sample - loss: 17.7032 - mae: 3.1546 - val_loss: 20.0981 - val_mae: 3.4221 Epoch 41/200 301/301 [==============================] - 0s 118us/sample - loss: 17.6117 - mae: 3.1780 - val_loss: 20.2461 - val_mae: 3.4847 Epoch 42/200 301/301 [==============================] - 0s 125us/sample - loss: 17.3149 - mae: 3.1219 - val_loss: 19.7649 - val_mae: 3.4011 Epoch 43/200 301/301 [==============================] - 0s 125us/sample - loss: 17.4388 - mae: 3.1236 - val_loss: 19.5916 - val_mae: 3.3837 Epoch 44/200 301/301 [==============================] - 0s 127us/sample - loss: 17.0391 - mae: 3.1106 - val_loss: 19.7758 - val_mae: 3.4497 Epoch 45/200 301/301 [==============================] - 0s 119us/sample - loss: 17.2350 - mae: 3.1005 - val_loss: 19.5530 - val_mae: 3.3837 Epoch 46/200 301/301 [==============================] - 0s 126us/sample - loss: 17.0341 - mae: 3.0860 - val_loss: 19.7196 - val_mae: 3.4389 Epoch 47/200 301/301 [==============================] - 0s 123us/sample - loss: 16.6865 - mae: 3.0482 - val_loss: 19.2217 - val_mae: 3.3538 Epoch 48/200 301/301 [==============================] - 0s 118us/sample - loss: 16.8795 - mae: 3.0442 - val_loss: 19.0282 - val_mae: 3.3352 Epoch 49/200 301/301 [==============================] - 0s 
125us/sample - loss: 16.7730 - mae: 3.0551 - val_loss: 18.9375 - val_mae: 3.3270 Epoch 50/200 301/301 [==============================] - 0s 125us/sample - loss: 16.4802 - mae: 3.0324 - val_loss: 19.2448 - val_mae: 3.3612 Epoch 51/200 301/301 [==============================] - 0s 119us/sample - loss: 16.5153 - mae: 3.0522 - val_loss: 19.4292 - val_mae: 3.4095 Epoch 52/200 301/301 [==============================] - 0s 121us/sample - loss: 16.4988 - mae: 3.0160 - val_loss: 18.8846 - val_mae: 3.3154 Epoch 53/200 301/301 [==============================] - 0s 122us/sample - loss: 16.1466 - mae: 2.9883 - val_loss: 18.5410 - val_mae: 3.2888 Epoch 54/200 301/301 [==============================] - 0s 122us/sample - loss: 16.1745 - mae: 2.9937 - val_loss: 18.7020 - val_mae: 3.3208 Epoch 55/200 301/301 [==============================] - 0s 124us/sample - loss: 16.0539 - mae: 2.9539 - val_loss: 18.5503 - val_mae: 3.2938 Epoch 56/200 301/301 [==============================] - 0s 118us/sample - loss: 16.0167 - mae: 2.9664 - val_loss: 18.2565 - val_mae: 3.2418 Epoch 57/200 301/301 [==============================] - 0s 122us/sample - loss: 16.3104 - mae: 3.0308 - val_loss: 18.2372 - val_mae: 3.2643 Epoch 58/200 301/301 [==============================] - 0s 122us/sample - loss: 15.8777 - mae: 2.9556 - val_loss: 18.1957 - val_mae: 3.2473 Epoch 59/200 301/301 [==============================] - 0s 123us/sample - loss: 15.5891 - mae: 2.9320 - val_loss: 18.1278 - val_mae: 3.2450 Epoch 60/200 301/301 [==============================] - 0s 122us/sample - loss: 15.5609 - mae: 2.9356 - val_loss: 18.2151 - val_mae: 3.2529 Epoch 61/200 301/301 [==============================] - 0s 122us/sample - loss: 15.4188 - mae: 2.9133 - val_loss: 18.0792 - val_mae: 3.2503 Epoch 62/200 301/301 [==============================] - 0s 116us/sample - loss: 15.4374 - mae: 2.9200 - val_loss: 18.0472 - val_mae: 3.2176 Epoch 63/200 301/301 [==============================] - 0s 117us/sample - loss: 15.4632 - mae: 
2.9416 - val_loss: 18.2165 - val_mae: 3.2738 Epoch 64/200 301/301 [==============================] - 0s 116us/sample - loss: 15.1443 - mae: 2.8671 - val_loss: 17.7800 - val_mae: 3.1816 Epoch 65/200 301/301 [==============================] - 0s 116us/sample - loss: 15.3151 - mae: 2.8776 - val_loss: 17.7542 - val_mae: 3.1752 Epoch 66/200 301/301 [==============================] - 0s 118us/sample - loss: 15.2872 - mae: 2.9002 - val_loss: 17.5472 - val_mae: 3.1765 Epoch 67/200 301/301 [==============================] - 0s 120us/sample - loss: 15.0157 - mae: 2.8609 - val_loss: 17.6483 - val_mae: 3.1807 Epoch 68/200 301/301 [==============================] - 0s 119us/sample - loss: 14.9769 - mae: 2.8574 - val_loss: 17.4434 - val_mae: 3.1518 Epoch 69/200 301/301 [==============================] - 0s 123us/sample - loss: 14.9233 - mae: 2.8683 - val_loss: 17.9638 - val_mae: 3.2400 Epoch 70/200 301/301 [==============================] - 0s 118us/sample - loss: 14.7114 - mae: 2.8025 - val_loss: 17.6011 - val_mae: 3.1543 Epoch 71/200 301/301 [==============================] - 0s 120us/sample - loss: 15.1015 - mae: 2.8787 - val_loss: 17.4322 - val_mae: 3.1438 Epoch 72/200 301/301 [==============================] - 0s 123us/sample - loss: 14.7286 - mae: 2.8385 - val_loss: 17.2788 - val_mae: 3.1298 Epoch 73/200 301/301 [==============================] - 0s 121us/sample - loss: 14.5584 - mae: 2.8345 - val_loss: 17.2092 - val_mae: 3.1225 Epoch 74/200 301/301 [==============================] - 0s 119us/sample - loss: 14.5332 - mae: 2.8146 - val_loss: 17.1070 - val_mae: 3.1110 Epoch 75/200 301/301 [==============================] - 0s 119us/sample - loss: 14.3858 - mae: 2.7990 - val_loss: 17.3526 - val_mae: 3.1561 Epoch 76/200 301/301 [==============================] - 0s 120us/sample - loss: 14.5994 - mae: 2.8056 - val_loss: 17.1146 - val_mae: 3.1033 Epoch 77/200 301/301 [==============================] - 0s 124us/sample - loss: 14.5253 - mae: 2.7939 - val_loss: 16.9778 - val_mae: 
3.0878 Epoch 78/200 301/301 [==============================] - 0s 122us/sample - loss: 14.1485 - mae: 2.7730 - val_loss: 16.9012 - val_mae: 3.0796 Epoch 79/200 301/301 [==============================] - 0s 120us/sample - loss: 14.2219 - mae: 2.7567 - val_loss: 16.7951 - val_mae: 3.0728 Epoch 80/200 301/301 [==============================] - 0s 117us/sample - loss: 14.0202 - mae: 2.7571 - val_loss: 16.7773 - val_mae: 3.0673 Epoch 81/200 301/301 [==============================] - 0s 124us/sample - loss: 13.9652 - mae: 2.7693 - val_loss: 17.0990 - val_mae: 3.1330 Epoch 82/200 301/301 [==============================] - 0s 123us/sample - loss: 14.0819 - mae: 2.7496 - val_loss: 16.7055 - val_mae: 3.0520 Epoch 83/200 301/301 [==============================] - 0s 120us/sample - loss: 13.8976 - mae: 2.7574 - val_loss: 16.7244 - val_mae: 3.0736 Epoch 84/200 301/301 [==============================] - 0s 119us/sample - loss: 13.8034 - mae: 2.7300 - val_loss: 16.4160 - val_mae: 3.0303 Epoch 85/200 301/301 [==============================] - 0s 117us/sample - loss: 13.7189 - mae: 2.7171 - val_loss: 16.2960 - val_mae: 3.0129 Epoch 86/200 301/301 [==============================] - 0s 122us/sample - loss: 13.5944 - mae: 2.7271 - val_loss: 16.8948 - val_mae: 3.0944 Epoch 87/200 301/301 [==============================] - 0s 124us/sample - loss: 13.6705 - mae: 2.7007 - val_loss: 16.0808 - val_mae: 2.9899 Epoch 88/200 301/301 [==============================] - 0s 126us/sample - loss: 13.6086 - mae: 2.7257 - val_loss: 16.3483 - val_mae: 3.0361 Epoch 89/200 301/301 [==============================] - 0s 118us/sample - loss: 13.5893 - mae: 2.7070 - val_loss: 16.2974 - val_mae: 3.0130 Epoch 90/200 301/301 [==============================] - 0s 123us/sample - loss: 13.4596 - mae: 2.6890 - val_loss: 16.2557 - val_mae: 2.9991 Epoch 91/200 301/301 [==============================] - 0s 127us/sample - loss: 13.4470 - mae: 2.6990 - val_loss: 16.2134 - val_mae: 2.9988 Epoch 92/200 301/301 
[==============================] - 0s 120us/sample - loss: 13.2421 - mae: 2.6861 - val_loss: 16.3483 - val_mae: 2.9998 Epoch 93/200 301/301 [==============================] - 0s 123us/sample - loss: 13.3118 - mae: 2.6853 - val_loss: 16.3244 - val_mae: 2.9960 Epoch 94/200 301/301 [==============================] - 0s 122us/sample - loss: 13.2576 - mae: 2.6856 - val_loss: 16.4283 - val_mae: 3.0137 Epoch 95/200 301/301 [==============================] - 0s 124us/sample - loss: 13.1833 - mae: 2.6654 - val_loss: 16.3141 - val_mae: 3.0039 Epoch 96/200 301/301 [==============================] - 0s 119us/sample - loss: 13.1088 - mae: 2.6582 - val_loss: 16.2310 - val_mae: 3.0056 Epoch 97/200 301/301 [==============================] - 0s 117us/sample - loss: 13.3443 - mae: 2.6880 - val_loss: 16.1252 - val_mae: 2.9760 101/1 [======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
...] - 0s 37us/sample - loss: 15.4843 - mae: 2.6630
[CV] END learning_rate=0.002049356284410796, n_hidden=1, n_neurons=13; total time= 4.1s
Train on 301 samples, validate on 135 samples
Epoch 1/200
301/301 [==============================] - 0s 2ms/sample - loss: 566.7206 - mae: 21.9104 - val_loss: 528.9960 - val_mae: 21.4291
...
Epoch 132/200
301/301 [==============================] - 0s 121us/sample - loss: 11.2028 - mae: 2.4216 - val_loss: 12.8889 - val_mae: 2.5814
101/1 [...] - 0s 38us/sample - loss: 8.4723 - mae: 2.5193
[CV] END learning_rate=0.002049356284410796, n_hidden=1, n_neurons=13; total time= 5.4s
Train on 302 samples, validate on 135 samples
Epoch 1/200
302/302 [==============================] - 1s 2ms/sample - loss: 572.9176 - mae: 22.0544 - val_loss: 542.8304 - val_mae: 21.6978
...
Epoch 114/200
302/302 [==============================] - 0s 115us/sample - loss: 11.3380 - mae: 2.4660 - val_loss: 14.2465 - val_mae: 2.8081
Epoch 115/200
302/302
[==============================] - 0s 119us/sample - loss: 11.4154 - mae: 2.4865 - val_loss: 13.8225 - val_mae: 2.7535 Epoch 116/200 302/302 [==============================] - 0s 118us/sample - loss: 11.4434 - mae: 2.4731 - val_loss: 13.7472 - val_mae: 2.7370 Epoch 117/200 302/302 [==============================] - 0s 119us/sample - loss: 11.5614 - mae: 2.4946 - val_loss: 13.7865 - val_mae: 2.7394 Epoch 118/200 302/302 [==============================] - 0s 118us/sample - loss: 11.2730 - mae: 2.4585 - val_loss: 13.8867 - val_mae: 2.7615 Epoch 119/200 302/302 [==============================] - 0s 119us/sample - loss: 11.4342 - mae: 2.4751 - val_loss: 14.1786 - val_mae: 2.7944 Epoch 120/200 302/302 [==============================] - 0s 124us/sample - loss: 11.1733 - mae: 2.4447 - val_loss: 13.5610 - val_mae: 2.7135 Epoch 121/200 302/302 [==============================] - 0s 122us/sample - loss: 11.2113 - mae: 2.4648 - val_loss: 13.5388 - val_mae: 2.7139 Epoch 122/200 302/302 [==============================] - 0s 118us/sample - loss: 11.2780 - mae: 2.4563 - val_loss: 13.8796 - val_mae: 2.7500 Epoch 123/200 302/302 [==============================] - 0s 120us/sample - loss: 11.2962 - mae: 2.4473 - val_loss: 13.7442 - val_mae: 2.7445 Epoch 124/200 302/302 [==============================] - 0s 122us/sample - loss: 10.9343 - mae: 2.4233 - val_loss: 13.6217 - val_mae: 2.6812 Epoch 125/200 302/302 [==============================] - 0s 124us/sample - loss: 11.1418 - mae: 2.4844 - val_loss: 14.3408 - val_mae: 2.8072 Epoch 126/200 302/302 [==============================] - 0s 120us/sample - loss: 11.0794 - mae: 2.4376 - val_loss: 13.3828 - val_mae: 2.6632 Epoch 127/200 302/302 [==============================] - 0s 118us/sample - loss: 11.1466 - mae: 2.4597 - val_loss: 13.3516 - val_mae: 2.6875 Epoch 128/200 302/302 [==============================] - 0s 121us/sample - loss: 11.1011 - mae: 2.4367 - val_loss: 13.4265 - val_mae: 2.6975 Epoch 129/200 302/302 
[==============================] - 0s 118us/sample - loss: 10.9499 - mae: 2.4353 - val_loss: 13.6531 - val_mae: 2.7421 Epoch 130/200 302/302 [==============================] - 0s 123us/sample - loss: 11.1256 - mae: 2.4671 - val_loss: 13.7000 - val_mae: 2.7443 Epoch 131/200 302/302 [==============================] - 0s 119us/sample - loss: 11.0147 - mae: 2.4452 - val_loss: 13.3520 - val_mae: 2.6796 Epoch 132/200 302/302 [==============================] - 0s 115us/sample - loss: 10.9204 - mae: 2.4231 - val_loss: 13.8848 - val_mae: 2.7406 Epoch 133/200 302/302 [==============================] - 0s 123us/sample - loss: 10.9898 - mae: 2.4286 - val_loss: 13.3986 - val_mae: 2.6838 Epoch 134/200 302/302 [==============================] - 0s 121us/sample - loss: 10.7339 - mae: 2.4015 - val_loss: 13.2449 - val_mae: 2.6626 Epoch 135/200 302/302 [==============================] - 0s 122us/sample - loss: 10.9066 - mae: 2.4430 - val_loss: 13.1493 - val_mae: 2.6434 Epoch 136/200 302/302 [==============================] - 0s 120us/sample - loss: 10.6812 - mae: 2.4145 - val_loss: 14.0592 - val_mae: 2.7539 Epoch 137/200 302/302 [==============================] - 0s 119us/sample - loss: 10.9264 - mae: 2.4093 - val_loss: 13.1245 - val_mae: 2.6210 Epoch 138/200 302/302 [==============================] - 0s 119us/sample - loss: 10.7271 - mae: 2.4091 - val_loss: 13.5475 - val_mae: 2.7159 Epoch 139/200 302/302 [==============================] - 0s 119us/sample - loss: 10.7566 - mae: 2.4150 - val_loss: 13.1144 - val_mae: 2.6410 Epoch 140/200 302/302 [==============================] - 0s 118us/sample - loss: 10.8981 - mae: 2.4225 - val_loss: 13.4716 - val_mae: 2.6838 Epoch 141/200 302/302 [==============================] - 0s 119us/sample - loss: 10.6754 - mae: 2.3845 - val_loss: 13.1861 - val_mae: 2.6499 Epoch 142/200 302/302 [==============================] - 0s 118us/sample - loss: 10.7919 - mae: 2.4070 - val_loss: 13.1022 - val_mae: 2.6368 Epoch 143/200 302/302 
[==============================] - 0s 119us/sample - loss: 10.6949 - mae: 2.4011 - val_loss: 13.3254 - val_mae: 2.6664 Epoch 144/200 302/302 [==============================] - 0s 123us/sample - loss: 10.5583 - mae: 2.3701 - val_loss: 13.1309 - val_mae: 2.6257 Epoch 145/200 302/302 [==============================] - 0s 124us/sample - loss: 10.5572 - mae: 2.3610 - val_loss: 13.1675 - val_mae: 2.6557 Epoch 146/200 302/302 [==============================] - 0s 121us/sample - loss: 10.6458 - mae: 2.4057 - val_loss: 13.2535 - val_mae: 2.6634 Epoch 147/200 302/302 [==============================] - 0s 118us/sample - loss: 10.6893 - mae: 2.4135 - val_loss: 12.9759 - val_mae: 2.6171 Epoch 148/200 302/302 [==============================] - 0s 123us/sample - loss: 10.7185 - mae: 2.3963 - val_loss: 12.9606 - val_mae: 2.6141 Epoch 149/200 302/302 [==============================] - 0s 125us/sample - loss: 10.5228 - mae: 2.3725 - val_loss: 13.0335 - val_mae: 2.6258 Epoch 150/200 302/302 [==============================] - 0s 121us/sample - loss: 10.5729 - mae: 2.3630 - val_loss: 12.8634 - val_mae: 2.5955 Epoch 151/200 302/302 [==============================] - 0s 116us/sample - loss: 10.5181 - mae: 2.3878 - val_loss: 13.0577 - val_mae: 2.6473 Epoch 152/200 302/302 [==============================] - 0s 123us/sample - loss: 10.4218 - mae: 2.3603 - val_loss: 12.8188 - val_mae: 2.5929 Epoch 153/200 302/302 [==============================] - 0s 123us/sample - loss: 10.3815 - mae: 2.3323 - val_loss: 12.8714 - val_mae: 2.6117 Epoch 154/200 302/302 [==============================] - 0s 117us/sample - loss: 10.4583 - mae: 2.3938 - val_loss: 12.9568 - val_mae: 2.6207 Epoch 155/200 302/302 [==============================] - 0s 118us/sample - loss: 10.3705 - mae: 2.3388 - val_loss: 12.8654 - val_mae: 2.5925 Epoch 156/200 302/302 [==============================] - 0s 117us/sample - loss: 10.4837 - mae: 2.4080 - val_loss: 13.1291 - val_mae: 2.6384 Epoch 157/200 302/302 
[==============================] - 0s 117us/sample - loss: 10.3349 - mae: 2.3452 - val_loss: 12.8158 - val_mae: 2.5837 Epoch 158/200 302/302 [==============================] - 0s 126us/sample - loss: 10.3006 - mae: 2.3460 - val_loss: 13.1154 - val_mae: 2.6459 Epoch 159/200 302/302 [==============================] - 0s 121us/sample - loss: 10.3137 - mae: 2.3509 - val_loss: 13.1255 - val_mae: 2.6376 Epoch 160/200 302/302 [==============================] - 0s 116us/sample - loss: 10.4750 - mae: 2.3712 - val_loss: 12.9281 - val_mae: 2.6166 Epoch 161/200 302/302 [==============================] - 0s 116us/sample - loss: 10.3240 - mae: 2.3444 - val_loss: 12.8943 - val_mae: 2.6041 Epoch 162/200 302/302 [==============================] - 0s 117us/sample - loss: 10.2648 - mae: 2.3367 - val_loss: 13.1012 - val_mae: 2.6381 Epoch 163/200 302/302 [==============================] - 0s 120us/sample - loss: 10.3119 - mae: 2.3639 - val_loss: 12.9429 - val_mae: 2.6160 Epoch 164/200 302/302 [==============================] - 0s 122us/sample - loss: 10.3079 - mae: 2.3614 - val_loss: 13.3466 - val_mae: 2.6598 Epoch 165/200 302/302 [==============================] - 0s 117us/sample - loss: 10.4094 - mae: 2.3426 - val_loss: 12.9133 - val_mae: 2.6089 Epoch 166/200 302/302 [==============================] - 0s 119us/sample - loss: 10.2519 - mae: 2.3584 - val_loss: 12.8800 - val_mae: 2.6066 Epoch 167/200 302/302 [==============================] - 0s 119us/sample - loss: 10.2046 - mae: 2.3590 - val_loss: 12.9090 - val_mae: 2.5985 100/1 
[==============================] - 0s 39us/sample - loss: 8.9567 - mae: 2.3066
[CV] END learning_rate=0.002049356284410796, n_hidden=1, n_neurons=13; total time= 6.9s
Train on 302 samples, validate on 135 samples
Epoch 1/200 302/302 [==============================] - 0s 2ms/sample - loss: 604.4992 - mae: 22.1611 - val_loss: 555.3366 - val_mae: 21.4093
... (epochs 2-114 omitted: training loss falls from 548.49 to 10.55, val_loss from 508.81 to 15.51) ...
Epoch 115/200 302/302 [==============================] - 0s 121us/sample - loss: 10.4926 - mae: 2.3558 - val_loss: 15.9035 - val_mae: 2.9529 100/1
[==============================] - 0s 43us/sample - loss: 14.3781 - mae: 2.9872
[CV] END learning_rate=0.002049356284410796, n_hidden=1, n_neurons=13; total time= 4.8s
Train on 301 samples, validate on 135 samples
Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 436.3215 - mae: 19.4563 - val_loss: 119.3636 - val_mae: 9.8327
... (epochs 2-19 omitted: training loss falls from 51.69 to 15.55 while val_loss fluctuates between 13.78 and 28.90) ...
Epoch 20/200 301/301 [==============================] - 0s 188us/sample - loss: 13.3585 - mae: 2.6753 - val_loss: 16.1577 - val_mae: 2.6615 Epoch 21/200
301/301 [==============================] - 0s 186us/sample - loss: 10.9588 - mae: 2.3595 - val_loss: 17.7406 - val_mae: 2.7899 Epoch 22/200 301/301 [==============================] - 0s 190us/sample - loss: 12.7274 - mae: 2.5134 - val_loss: 17.6521 - val_mae: 2.9511 Epoch 23/200 301/301 [==============================] - 0s 183us/sample - loss: 11.2953 - mae: 2.3403 - val_loss: 21.9881 - val_mae: 3.0808 Epoch 24/200 301/301 [==============================] - 0s 196us/sample - loss: 12.9750 - mae: 2.5377 - val_loss: 17.1879 - val_mae: 2.9904 Epoch 25/200 301/301 [==============================] - 0s 186us/sample - loss: 11.9302 - mae: 2.3986 - val_loss: 15.1274 - val_mae: 2.7324 Epoch 26/200 301/301 [==============================] - 0s 191us/sample - loss: 10.7676 - mae: 2.3072 - val_loss: 16.5071 - val_mae: 2.9270 Epoch 27/200 301/301 [==============================] - 0s 188us/sample - loss: 11.6526 - mae: 2.3499 - val_loss: 14.7776 - val_mae: 2.6635 Epoch 28/200 301/301 [==============================] - 0s 192us/sample - loss: 11.9752 - mae: 2.4076 - val_loss: 19.6420 - val_mae: 3.0767 Epoch 29/200 301/301 [==============================] - 0s 189us/sample - loss: 11.3139 - mae: 2.3589 - val_loss: 16.2672 - val_mae: 2.8823 101/1 
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 60us/sample - loss: 14.0550 - mae: 2.7880 [CV] END learning_rate=0.0036089914604343796, n_hidden=2, n_neurons=60; total time= 2.4s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 1s 3ms/sample - loss: 366.4272 - mae: 17.4738 - val_loss: 58.0729 - val_mae: 6.3222 Epoch 2/200 301/301 [==============================] - 0s 189us/sample - loss: 34.8273 - mae: 4.5621 - val_loss: 23.5196 - val_mae: 3.3403 Epoch 3/200 301/301 [==============================] - 0s 202us/sample - loss: 20.6572 - mae: 3.3937 - val_loss: 25.1787 - val_mae: 3.6439 Epoch 4/200 301/301 [==============================] - 0s 193us/sample - loss: 22.1591 - mae: 3.4139 - val_loss: 20.1290 - val_mae: 3.0794 Epoch 5/200 301/301 [==============================] - 0s 195us/sample - loss: 19.4870 - mae: 3.2073 - val_loss: 15.9833 - val_mae: 2.7717 Epoch 6/200 301/301 [==============================] - 0s 190us/sample - loss: 15.6648 - 
mae: 2.8458 - val_loss: 18.6201 - val_mae: 3.0242 Epoch 7/200 301/301 [==============================] - 0s 205us/sample - loss: 17.6344 - mae: 2.9692 - val_loss: 15.8651 - val_mae: 2.8374 Epoch 8/200 301/301 [==============================] - 0s 201us/sample - loss: 14.9024 - mae: 2.7692 - val_loss: 20.8575 - val_mae: 3.3206 Epoch 9/200 301/301 [==============================] - 0s 192us/sample - loss: 17.5945 - mae: 2.9992 - val_loss: 23.5056 - val_mae: 3.5928 Epoch 10/200 301/301 [==============================] - 0s 196us/sample - loss: 14.6439 - mae: 2.7729 - val_loss: 15.0503 - val_mae: 2.7030 Epoch 11/200 301/301 [==============================] - 0s 187us/sample - loss: 13.3425 - mae: 2.6077 - val_loss: 16.5323 - val_mae: 2.9096 Epoch 12/200 301/301 [==============================] - 0s 187us/sample - loss: 13.6970 - mae: 2.6596 - val_loss: 16.9428 - val_mae: 3.0543 Epoch 13/200 301/301 [==============================] - 0s 184us/sample - loss: 13.0199 - mae: 2.6127 - val_loss: 32.3210 - val_mae: 4.2503 Epoch 14/200 301/301 [==============================] - 0s 200us/sample - loss: 16.0274 - mae: 2.8404 - val_loss: 17.9893 - val_mae: 3.1744 Epoch 15/200 301/301 [==============================] - 0s 189us/sample - loss: 12.0023 - mae: 2.4449 - val_loss: 18.9058 - val_mae: 3.0759 Epoch 16/200 301/301 [==============================] - 0s 189us/sample - loss: 14.5151 - mae: 2.7263 - val_loss: 14.7729 - val_mae: 2.7839 Epoch 17/200 301/301 [==============================] - 0s 186us/sample - loss: 12.3684 - mae: 2.5287 - val_loss: 14.3472 - val_mae: 2.7775 Epoch 18/200 301/301 [==============================] - 0s 194us/sample - loss: 11.3309 - mae: 2.4010 - val_loss: 15.1023 - val_mae: 2.6348 Epoch 19/200 301/301 [==============================] - 0s 191us/sample - loss: 10.9226 - mae: 2.3337 - val_loss: 15.6159 - val_mae: 2.6782 Epoch 20/200 301/301 [==============================] - 0s 187us/sample - loss: 12.0966 - mae: 2.4171 - val_loss: 17.1192 - val_mae: 
3.0764 Epoch 21/200 301/301 [==============================] - 0s 189us/sample - loss: 11.7686 - mae: 2.4731 - val_loss: 18.2728 - val_mae: 3.0483 Epoch 22/200 301/301 [==============================] - 0s 185us/sample - loss: 11.0976 - mae: 2.3651 - val_loss: 22.0707 - val_mae: 3.1649 Epoch 23/200 301/301 [==============================] - 0s 194us/sample - loss: 12.3527 - mae: 2.5754 - val_loss: 15.5489 - val_mae: 2.6271 Epoch 24/200 301/301 [==============================] - 0s 188us/sample - loss: 14.9181 - mae: 2.7134 - val_loss: 14.0137 - val_mae: 2.5843 Epoch 25/200 301/301 [==============================] - 0s 193us/sample - loss: 11.8800 - mae: 2.5265 - val_loss: 22.4527 - val_mae: 3.2533 Epoch 26/200 301/301 [==============================] - 0s 195us/sample - loss: 12.5127 - mae: 2.6807 - val_loss: 14.1501 - val_mae: 2.5898 Epoch 27/200 301/301 [==============================] - 0s 194us/sample - loss: 12.4345 - mae: 2.5151 - val_loss: 14.3030 - val_mae: 2.6560 Epoch 28/200 301/301 [==============================] - 0s 182us/sample - loss: 9.8428 - mae: 2.2214 - val_loss: 17.5380 - val_mae: 3.2427 Epoch 29/200 301/301 [==============================] - 0s 192us/sample - loss: 11.5813 - mae: 2.4729 - val_loss: 29.6386 - val_mae: 3.8677 Epoch 30/200 301/301 [==============================] - 0s 189us/sample - loss: 14.8094 - mae: 2.7359 - val_loss: 20.8373 - val_mae: 3.4004 Epoch 31/200 301/301 [==============================] - 0s 194us/sample - loss: 11.8223 - mae: 2.4313 - val_loss: 15.6578 - val_mae: 2.6864 Epoch 32/200 301/301 [==============================] - 0s 180us/sample - loss: 10.2228 - mae: 2.1798 - val_loss: 12.5772 - val_mae: 2.4166 Epoch 33/200 301/301 [==============================] - 0s 190us/sample - loss: 9.6284 - mae: 2.1921 - val_loss: 14.5751 - val_mae: 2.7911 Epoch 34/200 301/301 [==============================] - 0s 189us/sample - loss: 11.4581 - mae: 2.4469 - val_loss: 16.9545 - val_mae: 2.8519 Epoch 35/200 301/301 
[==============================] - 0s 184us/sample - loss: 12.5456 - mae: 2.4360 - val_loss: 14.4745 - val_mae: 2.5820 Epoch 36/200 301/301 [==============================] - 0s 198us/sample - loss: 10.3579 - mae: 2.3338 - val_loss: 14.5705 - val_mae: 2.5016 Epoch 37/200 301/301 [==============================] - 0s 186us/sample - loss: 10.2379 - mae: 2.1799 - val_loss: 15.5360 - val_mae: 2.8904 Epoch 38/200 301/301 [==============================] - 0s 196us/sample - loss: 9.3261 - mae: 2.1844 - val_loss: 13.8172 - val_mae: 2.6327 Epoch 39/200 301/301 [==============================] - 0s 179us/sample - loss: 10.5067 - mae: 2.3110 - val_loss: 20.6171 - val_mae: 3.2024 Epoch 40/200 301/301 [==============================] - 0s 181us/sample - loss: 10.4334 - mae: 2.2732 - val_loss: 12.8973 - val_mae: 2.4485 Epoch 41/200 301/301 [==============================] - 0s 189us/sample - loss: 9.4417 - mae: 2.1279 - val_loss: 16.4285 - val_mae: 2.9874 Epoch 42/200 301/301 [==============================] - 0s 184us/sample - loss: 10.6818 - mae: 2.2862 - val_loss: 15.1514 - val_mae: 2.7290 101/1 
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 68us/sample - loss: 17.7443 - mae: 2.8820 [CV] END learning_rate=0.0036089914604343796, n_hidden=2, n_neurons=60; total time= 3.3s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 2ms/sample - loss: 357.6162 - mae: 16.8402 - val_loss: 72.5162 - val_mae: 6.5139 Epoch 2/200 302/302 [==============================] - 0s 194us/sample - loss: 63.1334 - mae: 5.8951 - val_loss: 62.9730 - val_mae: 5.7579 Epoch 3/200 302/302 [==============================] - 0s 189us/sample - loss: 37.9289 - mae: 4.4962 - val_loss: 32.6256 - val_mae: 4.1428 Epoch 4/200 302/302 [==============================] - 0s 195us/sample - loss: 29.0864 - mae: 3.8796 - val_loss: 28.1438 - val_mae: 3.7421 Epoch 5/200 302/302 [==============================] - 0s 190us/sample - loss: 21.7928 - mae: 3.2917 - val_loss: 21.6127 - val_mae: 3.3042 Epoch 6/200 302/302 [==============================] - 0s 196us/sample - loss: 16.5863 - 
mae: 2.8229 - val_loss: 20.7484 - val_mae: 3.1836 Epoch 7/200 302/302 [==============================] - 0s 258us/sample - loss: 17.6151 - mae: 2.9625 - val_loss: 17.9706 - val_mae: 3.2084 Epoch 8/200 302/302 [==============================] - 0s 263us/sample - loss: 15.1465 - mae: 2.8057 - val_loss: 16.6155 - val_mae: 2.8919 Epoch 9/200 302/302 [==============================] - 0s 244us/sample - loss: 15.2091 - mae: 2.7859 - val_loss: 21.1588 - val_mae: 3.1650 Epoch 10/200 302/302 [==============================] - 0s 194us/sample - loss: 15.7434 - mae: 2.7533 - val_loss: 17.2234 - val_mae: 2.9964 Epoch 11/200 302/302 [==============================] - 0s 199us/sample - loss: 17.4188 - mae: 2.9732 - val_loss: 16.2093 - val_mae: 2.9041 Epoch 12/200 302/302 [==============================] - 0s 193us/sample - loss: 13.1492 - mae: 2.5779 - val_loss: 19.6592 - val_mae: 3.0490 Epoch 13/200 302/302 [==============================] - 0s 196us/sample - loss: 15.6233 - mae: 2.8101 - val_loss: 25.2876 - val_mae: 3.7397 Epoch 14/200 302/302 [==============================] - 0s 202us/sample - loss: 15.5336 - mae: 2.8366 - val_loss: 18.5799 - val_mae: 3.1950 Epoch 15/200 302/302 [==============================] - 0s 188us/sample - loss: 11.6836 - mae: 2.4306 - val_loss: 16.4582 - val_mae: 2.9384 Epoch 16/200 302/302 [==============================] - 0s 198us/sample - loss: 15.7849 - mae: 2.8241 - val_loss: 14.7133 - val_mae: 2.7043 Epoch 17/200 302/302 [==============================] - 0s 193us/sample - loss: 10.5167 - mae: 2.2922 - val_loss: 14.7942 - val_mae: 2.6097 Epoch 18/200 302/302 [==============================] - 0s 200us/sample - loss: 12.6507 - mae: 2.5009 - val_loss: 15.4172 - val_mae: 2.6768 Epoch 19/200 302/302 [==============================] - 0s 197us/sample - loss: 12.6962 - mae: 2.5264 - val_loss: 16.3852 - val_mae: 2.7953 Epoch 20/200 302/302 [==============================] - 0s 187us/sample - loss: 12.6310 - mae: 2.4825 - val_loss: 16.1440 - val_mae: 
2.7859 Epoch 21/200 302/302 [==============================] - 0s 189us/sample - loss: 11.4543 - mae: 2.4265 - val_loss: 15.5057 - val_mae: 2.7323 Epoch 22/200 302/302 [==============================] - 0s 187us/sample - loss: 12.8262 - mae: 2.5383 - val_loss: 17.7497 - val_mae: 2.8616 Epoch 23/200 302/302 [==============================] - 0s 185us/sample - loss: 11.7090 - mae: 2.4818 - val_loss: 18.1677 - val_mae: 2.9470 Epoch 24/200 302/302 [==============================] - 0s 186us/sample - loss: 13.0490 - mae: 2.5916 - val_loss: 16.0468 - val_mae: 2.7919 Epoch 25/200 302/302 [==============================] - 0s 187us/sample - loss: 11.7384 - mae: 2.4427 - val_loss: 16.3525 - val_mae: 2.6303 Epoch 26/200 302/302 [==============================] - 0s 188us/sample - loss: 14.4655 - mae: 2.7142 - val_loss: 14.2088 - val_mae: 2.5569 Epoch 27/200 302/302 [==============================] - 0s 186us/sample - loss: 10.8709 - mae: 2.2836 - val_loss: 15.5375 - val_mae: 2.7746 Epoch 28/200 302/302 [==============================] - 0s 185us/sample - loss: 11.6142 - mae: 2.5160 - val_loss: 14.0966 - val_mae: 2.6385 Epoch 29/200 302/302 [==============================] - 0s 186us/sample - loss: 10.5270 - mae: 2.2344 - val_loss: 16.3336 - val_mae: 2.9059 Epoch 30/200 302/302 [==============================] - 0s 175us/sample - loss: 11.0131 - mae: 2.2891 - val_loss: 15.8435 - val_mae: 2.7527 Epoch 31/200 302/302 [==============================] - 0s 182us/sample - loss: 11.0297 - mae: 2.4000 - val_loss: 16.3091 - val_mae: 2.7326 Epoch 32/200 302/302 [==============================] - 0s 181us/sample - loss: 11.6513 - mae: 2.4238 - val_loss: 17.5384 - val_mae: 3.3271 Epoch 33/200 302/302 [==============================] - 0s 181us/sample - loss: 11.1659 - mae: 2.3917 - val_loss: 16.1206 - val_mae: 2.7460 Epoch 34/200 302/302 [==============================] - 0s 182us/sample - loss: 10.4285 - mae: 2.2519 - val_loss: 16.3469 - val_mae: 2.9402 Epoch 35/200 302/302 
[==============================] - 0s 193us/sample - loss: 12.4497 - mae: 2.6321 - val_loss: 17.5865 - val_mae: 3.0977 Epoch 36/200 302/302 [==============================] - 0s 187us/sample - loss: 10.5955 - mae: 2.3157 - val_loss: 14.3688 - val_mae: 2.5741 Epoch 37/200 302/302 [==============================] - 0s 185us/sample - loss: 10.7986 - mae: 2.2034 - val_loss: 16.1670 - val_mae: 2.8350 Epoch 38/200 302/302 [==============================] - 0s 185us/sample - loss: 9.8406 - mae: 2.2602 - val_loss: 26.7069 - val_mae: 3.5697 100/1 [===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 72us/sample - loss: 18.9605 - mae: 3.4174 [CV] END learning_rate=0.0036089914604343796, n_hidden=2, n_neurons=60; total time= 3.0s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 3ms/sample - loss: 376.5225 - mae: 17.7211 - val_loss: 104.2779 - val_mae: 8.5103 Epoch 2/200 302/302 [==============================] - 0s 185us/sample - loss: 44.8740 - mae: 5.1683 - val_loss: 30.3457 - 
val_mae: 4.1451 Epoch 3/200 302/302 [==============================] - 0s 188us/sample - loss: 27.2047 - mae: 3.8986 - val_loss: 24.5703 - val_mae: 3.5598 Epoch 4/200 302/302 [==============================] - 0s 192us/sample - loss: 22.2181 - mae: 3.3978 - val_loss: 24.2210 - val_mae: 3.3958 Epoch 5/200 302/302 [==============================] - 0s 183us/sample - loss: 17.6025 - mae: 3.0640 - val_loss: 18.6728 - val_mae: 3.0046 Epoch 6/200 302/302 [==============================] - 0s 185us/sample - loss: 18.0190 - mae: 3.0138 - val_loss: 17.6173 - val_mae: 2.9579 Epoch 7/200 302/302 [==============================] - 0s 191us/sample - loss: 15.4164 - mae: 2.7418 - val_loss: 22.9016 - val_mae: 3.3060 Epoch 8/200 302/302 [==============================] - 0s 181us/sample - loss: 14.5700 - mae: 2.6733 - val_loss: 22.1414 - val_mae: 3.4912 Epoch 9/200 302/302 [==============================] - 0s 190us/sample - loss: 13.6840 - mae: 2.6458 - val_loss: 17.8639 - val_mae: 2.9553 Epoch 10/200 302/302 [==============================] - 0s 187us/sample - loss: 13.2668 - mae: 2.6915 - val_loss: 21.0871 - val_mae: 3.2900 Epoch 11/200 302/302 [==============================] - 0s 186us/sample - loss: 17.0444 - mae: 2.9033 - val_loss: 18.2747 - val_mae: 3.0626 Epoch 12/200 302/302 [==============================] - 0s 185us/sample - loss: 12.9125 - mae: 2.5613 - val_loss: 17.4228 - val_mae: 3.0185 Epoch 13/200 302/302 [==============================] - 0s 180us/sample - loss: 13.2611 - mae: 2.6087 - val_loss: 31.9715 - val_mae: 4.1052 Epoch 14/200 302/302 [==============================] - 0s 183us/sample - loss: 14.9915 - mae: 2.6943 - val_loss: 18.5600 - val_mae: 3.1574 Epoch 15/200 302/302 [==============================] - 0s 182us/sample - loss: 11.8108 - mae: 2.4540 - val_loss: 17.7929 - val_mae: 3.1861 Epoch 16/200 302/302 [==============================] - 0s 182us/sample - loss: 11.9788 - mae: 2.4937 - val_loss: 18.0740 - val_mae: 2.9364 Epoch 17/200 302/302 
[==============================] - 0s 190us/sample - loss: 11.3639 - mae: 2.3514 - val_loss: 19.9639 - val_mae: 3.1644 Epoch 18/200 302/302 [==============================] - 0s 179us/sample - loss: 13.9905 - mae: 2.6707 - val_loss: 19.7152 - val_mae: 3.0300 Epoch 19/200 302/302 [==============================] - 0s 184us/sample - loss: 13.6611 - mae: 2.6066 - val_loss: 17.6015 - val_mae: 2.8848 Epoch 20/200 302/302 [==============================] - 0s 185us/sample - loss: 12.5805 - mae: 2.4947 - val_loss: 16.4853 - val_mae: 2.9829 Epoch 21/200 302/302 [==============================] - 0s 185us/sample - loss: 12.3634 - mae: 2.5300 - val_loss: 17.8936 - val_mae: 2.9946 Epoch 22/200 302/302 [==============================] - 0s 180us/sample - loss: 12.9805 - mae: 2.5408 - val_loss: 15.9184 - val_mae: 2.8270 Epoch 23/200 302/302 [==============================] - 0s 189us/sample - loss: 13.7353 - mae: 2.7322 - val_loss: 20.2707 - val_mae: 3.1181 Epoch 24/200 302/302 [==============================] - 0s 184us/sample - loss: 11.5228 - mae: 2.4009 - val_loss: 21.0150 - val_mae: 3.1696 Epoch 25/200 302/302 [==============================] - 0s 187us/sample - loss: 10.5532 - mae: 2.2392 - val_loss: 24.7245 - val_mae: 3.4356 Epoch 26/200 302/302 [==============================] - 0s 182us/sample - loss: 11.0197 - mae: 2.3631 - val_loss: 14.5230 - val_mae: 2.6460 Epoch 27/200 302/302 [==============================] - 0s 180us/sample - loss: 10.9522 - mae: 2.3224 - val_loss: 18.2912 - val_mae: 3.0150 Epoch 28/200 302/302 [==============================] - 0s 188us/sample - loss: 13.5827 - mae: 2.6171 - val_loss: 18.7617 - val_mae: 2.9980 Epoch 29/200 302/302 [==============================] - 0s 184us/sample - loss: 11.4945 - mae: 2.3962 - val_loss: 16.6695 - val_mae: 2.7177 Epoch 30/200 302/302 [==============================] - 0s 180us/sample - loss: 10.9510 - mae: 2.3342 - val_loss: 15.7330 - val_mae: 2.7407 Epoch 31/200 302/302 [==============================] - 0s 
[per-epoch training output trimmed for readability; cross-validation fold summaries:]

[CV] END learning_rate=0.0036089914604343796, n_hidden=2, n_neurons=60; total time= 2.9s (held-out loss: 14.6934, mae: 3.0696)
[CV] END learning_rate=0.0016340270885667296, n_hidden=3, n_neurons=78; total time= 2.3s (held-out loss: 20.2064, mae: 3.0893)
[CV] END learning_rate=0.0016340270885667296, n_hidden=3, n_neurons=78; total time= 2.8s (held-out loss: 17.4792, mae: 3.3556)
[CV] END learning_rate=0.0016340270885667296, n_hidden=3, n_neurons=78; total time= 3.8s (held-out loss: 18.0899, mae: 2.4814)
[CV] END learning_rate=0.0016340270885667296, n_hidden=3, n_neurons=78; total time= 3.5s (held-out loss: 12.4065, mae: 3.0509)

[per-epoch output for the following fold truncated]
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 46us/sample - loss: 13.0841 - mae: 2.3825 [CV] END learning_rate=0.008229570472317429, n_hidden=3, n_neurons=12; total time= 2.5s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 277.9269 - mae: 14.1666 - val_loss: 42.6984 - val_mae: 4.8702 Epoch 2/200 301/301 [==============================] - 0s 146us/sample - loss: 31.7883 - mae: 4.3119 - val_loss: 20.7905 - val_mae: 3.4990 Epoch 3/200 301/301 [==============================] - 0s 145us/sample - loss: 19.4920 - mae: 3.3838 - val_loss: 26.4497 - val_mae: 3.6971 Epoch 4/200 301/301 [==============================] - 0s 150us/sample - loss: 21.3041 - mae: 3.3353 - val_loss: 23.9795 - val_mae: 3.5936 Epoch 5/200 301/301 [==============================] - 0s 146us/sample - loss: 17.0176 - mae: 3.0881 - val_loss: 24.3949 - val_mae: 3.5127 Epoch 6/200 301/301 [==============================] - 0s 145us/sample - loss: 16.3866 - 
mae: 3.0155 - val_loss: 16.2933 - val_mae: 2.9803 Epoch 7/200 301/301 [==============================] - 0s 145us/sample - loss: 15.7345 - mae: 2.8820 - val_loss: 14.0909 - val_mae: 2.6078 Epoch 8/200 301/301 [==============================] - 0s 140us/sample - loss: 14.1817 - mae: 2.7256 - val_loss: 14.0107 - val_mae: 2.8813 Epoch 9/200 301/301 [==============================] - 0s 148us/sample - loss: 13.5062 - mae: 2.7138 - val_loss: 13.5181 - val_mae: 2.6190 Epoch 10/200 301/301 [==============================] - 0s 144us/sample - loss: 13.9494 - mae: 2.7371 - val_loss: 13.9943 - val_mae: 2.8537 Epoch 11/200 301/301 [==============================] - 0s 140us/sample - loss: 11.1853 - mae: 2.4372 - val_loss: 16.0198 - val_mae: 2.8876 Epoch 12/200 301/301 [==============================] - 0s 140us/sample - loss: 13.5422 - mae: 2.7158 - val_loss: 17.5391 - val_mae: 3.0754 Epoch 13/200 301/301 [==============================] - 0s 140us/sample - loss: 10.8382 - mae: 2.4324 - val_loss: 15.9992 - val_mae: 3.2459 Epoch 14/200 301/301 [==============================] - 0s 142us/sample - loss: 13.0455 - mae: 2.6600 - val_loss: 13.0444 - val_mae: 2.6576 Epoch 15/200 301/301 [==============================] - 0s 137us/sample - loss: 12.0255 - mae: 2.5547 - val_loss: 18.0274 - val_mae: 3.2783 Epoch 16/200 301/301 [==============================] - 0s 140us/sample - loss: 12.0258 - mae: 2.5129 - val_loss: 22.6131 - val_mae: 3.5043 Epoch 17/200 301/301 [==============================] - 0s 139us/sample - loss: 11.8165 - mae: 2.5132 - val_loss: 21.8442 - val_mae: 3.4205 Epoch 18/200 301/301 [==============================] - 0s 140us/sample - loss: 11.4771 - mae: 2.4665 - val_loss: 13.0342 - val_mae: 2.6432 Epoch 19/200 301/301 [==============================] - 0s 145us/sample - loss: 10.3533 - mae: 2.3841 - val_loss: 14.8923 - val_mae: 2.7541 Epoch 20/200 301/301 [==============================] - 0s 142us/sample - loss: 11.8316 - mae: 2.5523 - val_loss: 14.1302 - val_mae: 
2.7313 Epoch 21/200 301/301 [==============================] - 0s 147us/sample - loss: 10.1486 - mae: 2.3498 - val_loss: 15.0938 - val_mae: 2.7571 Epoch 22/200 301/301 [==============================] - 0s 140us/sample - loss: 12.3510 - mae: 2.5880 - val_loss: 12.2996 - val_mae: 2.4999 Epoch 23/200 301/301 [==============================] - 0s 145us/sample - loss: 11.5149 - mae: 2.4904 - val_loss: 12.4455 - val_mae: 2.6345 Epoch 24/200 301/301 [==============================] - 0s 142us/sample - loss: 10.2946 - mae: 2.3675 - val_loss: 13.3475 - val_mae: 2.7657 Epoch 25/200 301/301 [==============================] - 0s 143us/sample - loss: 11.0830 - mae: 2.4413 - val_loss: 16.2100 - val_mae: 2.9002 Epoch 26/200 301/301 [==============================] - 0s 141us/sample - loss: 10.8866 - mae: 2.3743 - val_loss: 13.0230 - val_mae: 2.5811 Epoch 27/200 301/301 [==============================] - 0s 138us/sample - loss: 11.2918 - mae: 2.4647 - val_loss: 14.6421 - val_mae: 2.7329 Epoch 28/200 301/301 [==============================] - 0s 142us/sample - loss: 11.3130 - mae: 2.4254 - val_loss: 17.7586 - val_mae: 3.2128 Epoch 29/200 301/301 [==============================] - 0s 139us/sample - loss: 11.7669 - mae: 2.4797 - val_loss: 11.6454 - val_mae: 2.3744 Epoch 30/200 301/301 [==============================] - 0s 141us/sample - loss: 10.1861 - mae: 2.3615 - val_loss: 14.0380 - val_mae: 2.7139 Epoch 31/200 301/301 [==============================] - 0s 141us/sample - loss: 11.1781 - mae: 2.4847 - val_loss: 11.3433 - val_mae: 2.3595 Epoch 32/200 301/301 [==============================] - 0s 137us/sample - loss: 11.2940 - mae: 2.4885 - val_loss: 12.7291 - val_mae: 2.5560 Epoch 33/200 301/301 [==============================] - 0s 144us/sample - loss: 10.0542 - mae: 2.2805 - val_loss: 14.5145 - val_mae: 2.9702 Epoch 34/200 301/301 [==============================] - 0s 139us/sample - loss: 10.5245 - mae: 2.3904 - val_loss: 12.9567 - val_mae: 2.6665 Epoch 35/200 301/301 
[==============================] - 0s 134us/sample - loss: 10.9214 - mae: 2.4168 - val_loss: 11.5693 - val_mae: 2.4758 Epoch 36/200 301/301 [==============================] - 0s 140us/sample - loss: 9.7687 - mae: 2.2635 - val_loss: 13.2589 - val_mae: 2.8163 Epoch 37/200 301/301 [==============================] - 0s 137us/sample - loss: 10.2896 - mae: 2.3344 - val_loss: 13.4506 - val_mae: 2.6484 Epoch 38/200 301/301 [==============================] - 0s 140us/sample - loss: 9.9028 - mae: 2.2936 - val_loss: 12.0838 - val_mae: 2.4882 Epoch 39/200 301/301 [==============================] - 0s 141us/sample - loss: 10.8216 - mae: 2.5113 - val_loss: 14.4469 - val_mae: 2.5374 Epoch 40/200 301/301 [==============================] - 0s 140us/sample - loss: 8.8260 - mae: 2.1418 - val_loss: 15.0461 - val_mae: 2.8675 Epoch 41/200 301/301 [==============================] - 0s 144us/sample - loss: 10.5579 - mae: 2.3880 - val_loss: 12.1761 - val_mae: 2.4644 101/1 [=============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 
43us/sample - loss: 11.0471 - mae: 2.4719 [CV] END learning_rate=0.008229570472317429, n_hidden=3, n_neurons=12; total time= 2.8s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 2ms/sample - loss: 341.8764 - mae: 15.8478 - val_loss: 43.8066 - val_mae: 5.1969 Epoch 2/200 302/302 [==============================] - 0s 138us/sample - loss: 39.8119 - mae: 4.7547 - val_loss: 30.5611 - val_mae: 4.1037 Epoch 3/200 302/302 [==============================] - 0s 145us/sample - loss: 24.7970 - mae: 3.8192 - val_loss: 23.4342 - val_mae: 3.6744 Epoch 4/200 302/302 [==============================] - 0s 145us/sample - loss: 19.6069 - mae: 3.2558 - val_loss: 18.1275 - val_mae: 3.2502 Epoch 5/200 302/302 [==============================] - 0s 146us/sample - loss: 17.5170 - mae: 3.0945 - val_loss: 21.7002 - val_mae: 3.4132 Epoch 6/200 302/302 [==============================] - 0s 145us/sample - loss: 16.8962 - mae: 2.9185 - val_loss: 14.1989 - val_mae: 2.7548 Epoch 7/200 302/302 [==============================] - 0s 146us/sample - loss: 12.8342 - mae: 2.6037 - val_loss: 15.6290 - val_mae: 2.8610 Epoch 8/200 302/302 [==============================] - 0s 146us/sample - loss: 13.1387 - mae: 2.5916 - val_loss: 13.5028 - val_mae: 2.7286 Epoch 9/200 302/302 [==============================] - 0s 148us/sample - loss: 12.7637 - mae: 2.5913 - val_loss: 14.8943 - val_mae: 2.8868 Epoch 10/200 302/302 [==============================] - 0s 134us/sample - loss: 12.8083 - mae: 2.6834 - val_loss: 17.6200 - val_mae: 2.9600 Epoch 11/200 302/302 [==============================] - 0s 145us/sample - loss: 12.0658 - mae: 2.5517 - val_loss: 13.5063 - val_mae: 2.6876 Epoch 12/200 302/302 [==============================] - 0s 143us/sample - loss: 12.9325 - mae: 2.6301 - val_loss: 15.5088 - val_mae: 2.9252 Epoch 13/200 302/302 [==============================] - 0s 142us/sample - loss: 10.5812 - mae: 2.3608 - val_loss: 14.6235 - val_mae: 2.7658 Epoch 14/200 
302/302 [==============================] - 0s 138us/sample - loss: 11.7932 - mae: 2.5308 - val_loss: 16.1179 - val_mae: 2.8420 Epoch 15/200 302/302 [==============================] - 0s 146us/sample - loss: 10.9278 - mae: 2.3619 - val_loss: 13.1425 - val_mae: 2.6296 Epoch 16/200 302/302 [==============================] - 0s 140us/sample - loss: 12.8270 - mae: 2.5832 - val_loss: 15.0722 - val_mae: 2.8061 Epoch 17/200 302/302 [==============================] - 0s 134us/sample - loss: 11.4353 - mae: 2.4413 - val_loss: 12.9153 - val_mae: 2.6124 Epoch 18/200 302/302 [==============================] - 0s 136us/sample - loss: 9.7308 - mae: 2.2257 - val_loss: 21.3066 - val_mae: 3.2575 Epoch 19/200 302/302 [==============================] - 0s 138us/sample - loss: 12.3271 - mae: 2.5830 - val_loss: 14.8490 - val_mae: 2.8609 Epoch 20/200 302/302 [==============================] - 0s 142us/sample - loss: 10.9758 - mae: 2.3396 - val_loss: 15.8330 - val_mae: 2.8765 Epoch 21/200 302/302 [==============================] - 0s 138us/sample - loss: 10.3303 - mae: 2.3263 - val_loss: 15.5786 - val_mae: 2.8065 Epoch 22/200 302/302 [==============================] - 0s 141us/sample - loss: 11.0813 - mae: 2.4141 - val_loss: 13.0436 - val_mae: 2.5995 Epoch 23/200 302/302 [==============================] - 0s 147us/sample - loss: 9.8636 - mae: 2.2037 - val_loss: 13.3039 - val_mae: 2.7496 Epoch 24/200 302/302 [==============================] - 0s 137us/sample - loss: 10.2578 - mae: 2.3194 - val_loss: 14.8128 - val_mae: 2.8267 Epoch 25/200 302/302 [==============================] - 0s 141us/sample - loss: 9.4456 - mae: 2.2700 - val_loss: 13.8110 - val_mae: 2.6860 Epoch 26/200 302/302 [==============================] - 0s 140us/sample - loss: 10.3895 - mae: 2.3775 - val_loss: 14.5209 - val_mae: 2.8044 Epoch 27/200 302/302 [==============================] - 0s 141us/sample - loss: 10.3341 - mae: 2.3265 - val_loss: 15.0677 - val_mae: 2.8407 100/1 
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 48us/sample - loss: 15.7212 - mae: 2.5874 [CV] END learning_rate=0.008229570472317429, n_hidden=3, n_neurons=12; total time= 2.0s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 3ms/sample - loss: 311.8687 - mae: 15.7074 - val_loss: 57.5546 - val_mae: 6.0691 Epoch 2/200 302/302 [==============================] - 0s 147us/sample - loss: 35.1642 - mae: 4.6895 - val_loss: 26.1977 - val_mae: 3.7847 Epoch 3/200 302/302 [==============================] - 0s 141us/sample - loss: 20.1123 - mae: 3.4321 - val_loss: 31.3358 - val_mae: 3.9375 Epoch 4/200 302/302 [==============================] - 0s 144us/sample - loss: 18.4082 - mae: 3.2392 - val_loss: 22.4236 - val_mae: 3.4019 Epoch 5/200 302/302 [==============================] - 0s 146us/sample - loss: 12.9268 - mae: 2.7000 - val_loss: 18.7101 - val_mae: 3.0889 Epoch 6/200 302/302 [==============================] - 0s 145us/sample - loss: 14.0137 - mae: 2.8027 - val_loss: 35.2253 
- val_mae: 4.3193 Epoch 7/200 302/302 [==============================] - 0s 145us/sample - loss: 15.5807 - mae: 2.9772 - val_loss: 16.7613 - val_mae: 2.8661 Epoch 8/200 302/302 [==============================] - 0s 146us/sample - loss: 12.7268 - mae: 2.6461 - val_loss: 17.9650 - val_mae: 3.0347 Epoch 9/200 302/302 [==============================] - 0s 146us/sample - loss: 12.4173 - mae: 2.6500 - val_loss: 19.8698 - val_mae: 3.1655 Epoch 10/200 302/302 [==============================] - 0s 143us/sample - loss: 12.3479 - mae: 2.5714 - val_loss: 15.2742 - val_mae: 2.8915 Epoch 11/200 302/302 [==============================] - 0s 140us/sample - loss: 10.7820 - mae: 2.4381 - val_loss: 15.7561 - val_mae: 2.8810 Epoch 12/200 302/302 [==============================] - 0s 143us/sample - loss: 10.4685 - mae: 2.3863 - val_loss: 14.5989 - val_mae: 2.6965 Epoch 13/200 302/302 [==============================] - 0s 140us/sample - loss: 11.5256 - mae: 2.4964 - val_loss: 15.7252 - val_mae: 2.7944 Epoch 14/200 302/302 [==============================] - 0s 142us/sample - loss: 10.1971 - mae: 2.3544 - val_loss: 14.5490 - val_mae: 2.6510 Epoch 15/200 302/302 [==============================] - 0s 140us/sample - loss: 10.1651 - mae: 2.3189 - val_loss: 16.6646 - val_mae: 2.9321 Epoch 16/200 302/302 [==============================] - 0s 137us/sample - loss: 10.9195 - mae: 2.3637 - val_loss: 16.0315 - val_mae: 2.8220 Epoch 17/200 302/302 [==============================] - 0s 132us/sample - loss: 12.5219 - mae: 2.5517 - val_loss: 15.5123 - val_mae: 2.7828 Epoch 18/200 302/302 [==============================] - 0s 137us/sample - loss: 10.7089 - mae: 2.4729 - val_loss: 15.3946 - val_mae: 2.9173 Epoch 19/200 302/302 [==============================] - 0s 132us/sample - loss: 9.4286 - mae: 2.1874 - val_loss: 17.6533 - val_mae: 2.9656 Epoch 20/200 302/302 [==============================] - 0s 135us/sample - loss: 10.3941 - mae: 2.2776 - val_loss: 14.6038 - val_mae: 2.8340 Epoch 21/200 302/302 
[==============================] - 0s 137us/sample - loss: 10.2497 - mae: 2.2974 - val_loss: 14.9227 - val_mae: 2.6954 Epoch 22/200 302/302 [==============================] - 0s 141us/sample - loss: 9.7979 - mae: 2.2040 - val_loss: 14.3230 - val_mae: 2.5939 Epoch 23/200 302/302 [==============================] - 0s 132us/sample - loss: 10.9268 - mae: 2.2960 - val_loss: 18.1655 - val_mae: 3.1237 Epoch 24/200 302/302 [==============================] - 0s 140us/sample - loss: 11.8198 - mae: 2.5133 - val_loss: 15.4884 - val_mae: 2.7448 Epoch 25/200 302/302 [==============================] - 0s 133us/sample - loss: 10.5334 - mae: 2.4210 - val_loss: 13.8456 - val_mae: 2.5530 Epoch 26/200 302/302 [==============================] - 0s 138us/sample - loss: 9.4885 - mae: 2.1508 - val_loss: 15.5708 - val_mae: 2.7530 Epoch 27/200 302/302 [==============================] - 0s 138us/sample - loss: 9.3994 - mae: 2.1737 - val_loss: 14.6580 - val_mae: 2.6548 Epoch 28/200 302/302 [==============================] - 0s 136us/sample - loss: 9.7686 - mae: 2.1926 - val_loss: 15.1978 - val_mae: 2.8385 Epoch 29/200 302/302 [==============================] - 0s 135us/sample - loss: 9.3518 - mae: 2.2363 - val_loss: 14.5479 - val_mae: 2.7584 Epoch 30/200 302/302 [==============================] - 0s 137us/sample - loss: 9.7786 - mae: 2.2320 - val_loss: 18.6593 - val_mae: 2.9683 Epoch 31/200 302/302 [==============================] - 0s 131us/sample - loss: 9.5651 - mae: 2.1977 - val_loss: 13.5799 - val_mae: 2.5835 Epoch 32/200 302/302 [==============================] - 0s 139us/sample - loss: 8.7340 - mae: 2.1147 - val_loss: 15.5215 - val_mae: 2.8841 Epoch 33/200 302/302 [==============================] - 0s 138us/sample - loss: 11.2128 - mae: 2.3972 - val_loss: 14.2715 - val_mae: 2.6820 Epoch 34/200 302/302 [==============================] - 0s 135us/sample - loss: 10.1783 - mae: 2.2935 - val_loss: 14.5585 - val_mae: 2.7862 Epoch 35/200 302/302 [==============================] - 0s 
136us/sample - loss: 8.7816 - mae: 2.0930 - val_loss: 15.3892 - val_mae: 2.8418 Epoch 36/200 302/302 [==============================] - 0s 138us/sample - loss: 11.4370 - mae: 2.3576 - val_loss: 16.5220 - val_mae: 2.9974 Epoch 37/200 302/302 [==============================] - 0s 130us/sample - loss: 11.5854 - mae: 2.4504 - val_loss: 14.8909 - val_mae: 2.7999 Epoch 38/200 302/302 [==============================] - 0s 139us/sample - loss: 8.6083 - mae: 2.0010 - val_loss: 14.3134 - val_mae: 2.7688 Epoch 39/200 302/302 [==============================] - 0s 135us/sample - loss: 9.3827 - mae: 2.2097 - val_loss: 15.2978 - val_mae: 2.9070 Epoch 40/200 302/302 [==============================] - 0s 138us/sample - loss: 9.8458 - mae: 2.2405 - val_loss: 14.6123 - val_mae: 2.7649 Epoch 41/200 302/302 [==============================] - 0s 136us/sample - loss: 9.7670 - mae: 2.2182 - val_loss: 16.7879 - val_mae: 2.8640 100/1 [=====================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 43us/sample - loss: 9.4418 - mae: 2.8033 [CV] END 
learning_rate=0.008229570472317429, n_hidden=3, n_neurons=12; total time= 2.8s
Train on 301 samples, validate on 135 samples
Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 488.7058 - mae: 20.5788 - val_loss: 271.0241 - val_mae: 15.5795
...
Epoch 31/200 301/301 [==============================] - 0s 130us/sample - loss: 11.4556 - mae: 2.4092 - val_loss: 13.8239 - val_mae: 2.7735
101/101 [==============================] - 0s 40us/sample - loss: 12.2205 - mae: 2.3800
[CV] END learning_rate=0.005209877236352987, n_hidden=2, n_neurons=9; total time= 1.9s
Train on 301 samples, validate on 135 samples
Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 496.0431 - mae: 20.7871 - val_loss: 304.3033 - val_mae: 16.6339
...
Epoch 59/200 301/301 [==============================] - 0s 131us/sample - loss: 10.1000 - mae: 2.2736 - val_loss: 13.6749 - val_mae: 2.6875
101/101 [==============================] - 0s 41us/sample - loss: 10.4752 - mae: 2.6177
[CV] END learning_rate=0.005209877236352987, n_hidden=2, n_neurons=9; total time= 3.1s
Train on 302 samples, validate on 135 samples
Epoch 1/200 302/302 [==============================] - 1s 2ms/sample - loss: 446.9772 - mae: 19.8550 - val_loss: 224.5972 - val_mae: 13.9119
...
Epoch 30/200 302/302 [==============================] - 0s 131us/sample - loss: 10.2958 - mae: 2.3815 - val_loss: 13.9527 - val_mae: 2.7952
100/100 [==============================] - 0s 41us/sample - loss: 12.7511 - mae: 2.5493
[CV] END learning_rate=0.005209877236352987, n_hidden=2, n_neurons=9; total time= 1.9s
Train on 302 samples, validate on 135 samples
Epoch 1/200 302/302 [==============================] - 1s 3ms/sample - loss: 517.6192 - mae: 20.9708 - val_loss: 327.5756 - val_mae: 17.1359
...
Epoch 35/200 302/302 [==============================] - 0s 133us/sample - loss: 10.5763 - mae: 2.3146 - val_loss: 14.6180 - val_mae: 2.7011
100/100 [==============================] - 0s 48us/sample - loss: 8.9115 - mae: 2.7109
[CV] END learning_rate=0.005209877236352987, n_hidden=2, n_neurons=9; total time= 2.3s
Train on 301 samples, validate on 135 samples
Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 331.0825 - mae: 16.0669 - val_loss: 63.6898 - val_mae: 6.6288
...
Epoch 34/200 301/301 [==============================] - 0s 184us/sample - loss: 12.4471 - mae:
2.5174 - val_loss: 16.5411 - val_mae: 2.9088 Epoch 35/200 301/301 [==============================] - 0s 182us/sample - loss: 12.9433 - mae: 2.5666 - val_loss: 17.1676 - val_mae: 3.0344 Epoch 36/200 301/301 [==============================] - 0s 185us/sample - loss: 12.3398 - mae: 2.5104 - val_loss: 17.7483 - val_mae: 3.0527 Epoch 37/200 301/301 [==============================] - 0s 185us/sample - loss: 10.9693 - mae: 2.3153 - val_loss: 19.0020 - val_mae: 3.2601 Epoch 38/200 301/301 [==============================] - 0s 181us/sample - loss: 10.5694 - mae: 2.2985 - val_loss: 17.5260 - val_mae: 3.0454 101/1 [============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
==========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 63us/sample - loss: 18.2913 - mae: 3.3571 [CV] END learning_rate=0.003985152393924074, n_hidden=2, n_neurons=54; total time= 2.8s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 391.5439 - mae: 18.2321 - val_loss: 74.4722 - val_mae: 7.3816 Epoch 2/200 301/301 
[==============================] - 0s 186us/sample - loss: 42.1716 - mae: 4.9787 - val_loss: 33.2338 - val_mae: 4.2246 Epoch 3/200 301/301 [==============================] - 0s 189us/sample - loss: 22.4624 - mae: 3.2936 - val_loss: 19.2065 - val_mae: 3.2529 Epoch 4/200 301/301 [==============================] - 0s 183us/sample - loss: 17.8427 - mae: 2.9765 - val_loss: 21.3843 - val_mae: 3.2604 Epoch 5/200 301/301 [==============================] - 0s 186us/sample - loss: 16.7412 - mae: 2.8961 - val_loss: 15.9518 - val_mae: 2.8446 Epoch 6/200 301/301 [==============================] - 0s 192us/sample - loss: 15.8460 - mae: 2.8022 - val_loss: 15.1935 - val_mae: 2.7383 Epoch 7/200 301/301 [==============================] - 0s 184us/sample - loss: 16.2214 - mae: 2.9208 - val_loss: 15.3136 - val_mae: 2.8017 Epoch 8/200 301/301 [==============================] - 0s 186us/sample - loss: 15.1550 - mae: 2.8348 - val_loss: 16.2488 - val_mae: 2.8871 Epoch 9/200 301/301 [==============================] - 0s 187us/sample - loss: 14.6052 - mae: 2.8727 - val_loss: 16.5428 - val_mae: 2.9475 Epoch 10/200 301/301 [==============================] - 0s 183us/sample - loss: 14.5575 - mae: 2.7069 - val_loss: 14.8731 - val_mae: 2.9529 Epoch 11/200 301/301 [==============================] - 0s 183us/sample - loss: 13.3832 - mae: 2.6410 - val_loss: 17.6570 - val_mae: 2.8517 Epoch 12/200 301/301 [==============================] - 0s 180us/sample - loss: 12.5403 - mae: 2.6162 - val_loss: 16.0357 - val_mae: 2.8828 Epoch 13/200 301/301 [==============================] - 0s 175us/sample - loss: 14.4089 - mae: 2.6316 - val_loss: 18.7523 - val_mae: 3.1608 Epoch 14/200 301/301 [==============================] - 0s 181us/sample - loss: 13.9429 - mae: 2.6442 - val_loss: 14.6356 - val_mae: 2.7082 Epoch 15/200 301/301 [==============================] - 0s 179us/sample - loss: 12.9296 - mae: 2.5937 - val_loss: 13.5298 - val_mae: 2.5888 Epoch 16/200 301/301 [==============================] - 0s 
183us/sample - loss: 12.8478 - mae: 2.5039 - val_loss: 13.8912 - val_mae: 2.6272 Epoch 17/200 301/301 [==============================] - 0s 185us/sample - loss: 10.9708 - mae: 2.3050 - val_loss: 13.7876 - val_mae: 2.5798 Epoch 18/200 301/301 [==============================] - 0s 186us/sample - loss: 12.8831 - mae: 2.5223 - val_loss: 15.2754 - val_mae: 2.8077 Epoch 19/200 301/301 [==============================] - 0s 183us/sample - loss: 12.8505 - mae: 2.6156 - val_loss: 22.2081 - val_mae: 3.1909 Epoch 20/200 301/301 [==============================] - 0s 180us/sample - loss: 14.7899 - mae: 2.7611 - val_loss: 12.6190 - val_mae: 2.4009 Epoch 21/200 301/301 [==============================] - 0s 186us/sample - loss: 10.7846 - mae: 2.3211 - val_loss: 18.5060 - val_mae: 3.2742 Epoch 22/200 301/301 [==============================] - 0s 183us/sample - loss: 11.1877 - mae: 2.3877 - val_loss: 14.0825 - val_mae: 2.7438 Epoch 23/200 301/301 [==============================] - 0s 179us/sample - loss: 11.0158 - mae: 2.4135 - val_loss: 13.8232 - val_mae: 2.5808 Epoch 24/200 301/301 [==============================] - 0s 184us/sample - loss: 10.7579 - mae: 2.2971 - val_loss: 13.0682 - val_mae: 2.5779 Epoch 25/200 301/301 [==============================] - 0s 182us/sample - loss: 10.6763 - mae: 2.3261 - val_loss: 14.5776 - val_mae: 2.8694 Epoch 26/200 301/301 [==============================] - 0s 184us/sample - loss: 10.5151 - mae: 2.3105 - val_loss: 11.9171 - val_mae: 2.3819 Epoch 27/200 301/301 [==============================] - 0s 189us/sample - loss: 10.7366 - mae: 2.3214 - val_loss: 13.5614 - val_mae: 2.4925 Epoch 28/200 301/301 [==============================] - 0s 182us/sample - loss: 10.9256 - mae: 2.3577 - val_loss: 13.1938 - val_mae: 2.5716 Epoch 29/200 301/301 [==============================] - 0s 185us/sample - loss: 11.3883 - mae: 2.3439 - val_loss: 27.0928 - val_mae: 3.9471 Epoch 30/200 301/301 [==============================] - 0s 183us/sample - loss: 14.5729 - mae: 
2.7400 - val_loss: 13.5470 - val_mae: 2.6159 Epoch 31/200 301/301 [==============================] - 0s 182us/sample - loss: 10.2653 - mae: 2.2320 - val_loss: 13.2128 - val_mae: 2.5728 Epoch 32/200 301/301 [==============================] - 0s 187us/sample - loss: 9.7429 - mae: 2.2013 - val_loss: 16.5017 - val_mae: 2.8858 Epoch 33/200 301/301 [==============================] - 0s 180us/sample - loss: 12.2133 - mae: 2.5297 - val_loss: 17.0029 - val_mae: 2.8018 Epoch 34/200 301/301 [==============================] - 0s 183us/sample - loss: 11.1452 - mae: 2.3489 - val_loss: 13.3628 - val_mae: 2.4296 Epoch 35/200 301/301 [==============================] - 0s 186us/sample - loss: 9.8847 - mae: 2.1953 - val_loss: 15.4609 - val_mae: 2.7895 Epoch 36/200 301/301 [==============================] - 0s 183us/sample - loss: 11.1920 - mae: 2.3601 - val_loss: 13.5887 - val_mae: 2.5180 101/1 [======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 61us/sample - loss: 9.5645 - mae: 2.4097 [CV] END 
learning_rate=0.003985152393924074, n_hidden=2, n_neurons=54; total time= 2.7s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 2ms/sample - loss: 399.2173 - mae: 18.0671 - val_loss: 89.0051 - val_mae: 8.1094 Epoch 2/200 302/302 [==============================] - 0s 185us/sample - loss: 42.4063 - mae: 4.9425 - val_loss: 25.9473 - val_mae: 3.7435 Epoch 3/200 302/302 [==============================] - 0s 188us/sample - loss: 19.5788 - mae: 3.1576 - val_loss: 41.3244 - val_mae: 4.4400 Epoch 4/200 302/302 [==============================] - 0s 191us/sample - loss: 21.2564 - mae: 3.3468 - val_loss: 21.8512 - val_mae: 3.4715 Epoch 5/200 302/302 [==============================] - 0s 185us/sample - loss: 20.4311 - mae: 3.3115 - val_loss: 20.6977 - val_mae: 3.4223 Epoch 6/200 302/302 [==============================] - 0s 185us/sample - loss: 17.6245 - mae: 3.0139 - val_loss: 19.0395 - val_mae: 3.1470 Epoch 7/200 302/302 [==============================] - 0s 185us/sample - loss: 17.4769 - mae: 2.9838 - val_loss: 18.2072 - val_mae: 3.1174 Epoch 8/200 302/302 [==============================] - 0s 185us/sample - loss: 15.4494 - mae: 2.8452 - val_loss: 20.1914 - val_mae: 3.2440 Epoch 9/200 302/302 [==============================] - 0s 183us/sample - loss: 17.6499 - mae: 3.0294 - val_loss: 15.7965 - val_mae: 2.9401 Epoch 10/200 302/302 [==============================] - 0s 184us/sample - loss: 15.7822 - mae: 2.9388 - val_loss: 33.5512 - val_mae: 4.4113 Epoch 11/200 302/302 [==============================] - 0s 179us/sample - loss: 16.6118 - mae: 2.8863 - val_loss: 15.1843 - val_mae: 2.7767 Epoch 12/200 302/302 [==============================] - 0s 181us/sample - loss: 13.6465 - mae: 2.6329 - val_loss: 15.2677 - val_mae: 2.7897 Epoch 13/200 302/302 [==============================] - 0s 184us/sample - loss: 13.4933 - mae: 2.6214 - val_loss: 16.9048 - val_mae: 2.8448 Epoch 14/200 302/302 [==============================] - 0s 
180us/sample - loss: 13.0036 - mae: 2.6232 - val_loss: 23.0319 - val_mae: 3.6890 Epoch 15/200 302/302 [==============================] - 0s 178us/sample - loss: 15.3269 - mae: 2.7980 - val_loss: 20.8623 - val_mae: 3.3762 Epoch 16/200 302/302 [==============================] - 0s 178us/sample - loss: 13.2688 - mae: 2.6634 - val_loss: 16.5644 - val_mae: 2.9923 Epoch 17/200 302/302 [==============================] - 0s 176us/sample - loss: 13.1924 - mae: 2.6681 - val_loss: 21.6627 - val_mae: 3.3474 Epoch 18/200 302/302 [==============================] - 0s 178us/sample - loss: 13.0810 - mae: 2.5854 - val_loss: 18.3399 - val_mae: 2.9997 Epoch 19/200 302/302 [==============================] - 0s 180us/sample - loss: 14.5744 - mae: 2.6260 - val_loss: 15.4702 - val_mae: 2.6733 Epoch 20/200 302/302 [==============================] - 0s 184us/sample - loss: 12.0824 - mae: 2.4691 - val_loss: 18.9317 - val_mae: 3.1158 Epoch 21/200 302/302 [==============================] - 0s 187us/sample - loss: 11.2519 - mae: 2.4629 - val_loss: 19.0578 - val_mae: 3.1958 100/1 
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 64us/sample - loss: 26.2692 - mae: 3.3228 [CV] END learning_rate=0.003985152393924074, n_hidden=2, n_neurons=54; total time= 2.1s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 2ms/sample - loss: 365.6587 - mae: 17.4239 - val_loss: 103.0989 - val_mae: 8.7851 Epoch 2/200 302/302 [==============================] - 0s 189us/sample - loss: 50.3198 - mae: 5.6274 - val_loss: 32.3384 - val_mae: 4.2877 Epoch 3/200 302/302 [==============================] - 0s 188us/sample - loss: 20.6733 - mae: 3.3868 - val_loss: 24.9716 - val_mae: 3.5430 Epoch 4/200 302/302 [==============================] - 0s 187us/sample - loss: 21.8113 - mae: 3.3885 - val_loss: 26.6301 - val_mae: 3.5831 Epoch 5/200 302/302 [==============================] - 0s 183us/sample - loss: 16.7547 - mae: 2.9305 - val_loss: 22.4827 - val_mae: 3.4302 Epoch 6/200 302/302 [==============================] - 0s 183us/sample - loss: 14.3811 - mae: 2.6953 - val_loss: 
16.0998 - val_mae: 2.8325 Epoch 7/200 302/302 [==============================] - 0s 183us/sample - loss: 15.2599 - mae: 2.8645 - val_loss: 17.2393 - val_mae: 3.0965 Epoch 8/200 302/302 [==============================] - 0s 184us/sample - loss: 14.9776 - mae: 2.8334 - val_loss: 24.5478 - val_mae: 3.5959 Epoch 9/200 302/302 [==============================] - 0s 183us/sample - loss: 12.7838 - mae: 2.5213 - val_loss: 15.0808 - val_mae: 2.7920 Epoch 10/200 302/302 [==============================] - 0s 189us/sample - loss: 14.2751 - mae: 2.6890 - val_loss: 19.9567 - val_mae: 3.1759 Epoch 11/200 302/302 [==============================] - 0s 185us/sample - loss: 13.7163 - mae: 2.7111 - val_loss: 26.8427 - val_mae: 3.6259 Epoch 12/200 302/302 [==============================] - 0s 180us/sample - loss: 19.0751 - mae: 3.1639 - val_loss: 20.5017 - val_mae: 3.1874 Epoch 13/200 302/302 [==============================] - 0s 183us/sample - loss: 12.7647 - mae: 2.5283 - val_loss: 15.1722 - val_mae: 2.7124 Epoch 14/200 302/302 [==============================] - 0s 186us/sample - loss: 12.5510 - mae: 2.5238 - val_loss: 14.4556 - val_mae: 2.6576 Epoch 15/200 302/302 [==============================] - 0s 187us/sample - loss: 13.3492 - mae: 2.5087 - val_loss: 35.9604 - val_mae: 4.2437 Epoch 16/200 302/302 [==============================] - 0s 183us/sample - loss: 14.2663 - mae: 2.7557 - val_loss: 19.7580 - val_mae: 3.1819 Epoch 17/200 302/302 [==============================] - 0s 184us/sample - loss: 13.4103 - mae: 2.4849 - val_loss: 18.1851 - val_mae: 3.0047 Epoch 18/200 302/302 [==============================] - 0s 188us/sample - loss: 13.6108 - mae: 2.6694 - val_loss: 16.1223 - val_mae: 2.8325 Epoch 19/200 302/302 [==============================] - 0s 178us/sample - loss: 11.3674 - mae: 2.4355 - val_loss: 17.1384 - val_mae: 2.7900 Epoch 20/200 302/302 [==============================] - 0s 180us/sample - loss: 10.8290 - mae: 2.3920 - val_loss: 14.7684 - val_mae: 2.5704 Epoch 21/200 
302/302 [==============================] - 0s 185us/sample - loss: 12.7786 - mae: 2.5336 - val_loss: 15.5383 - val_mae: 2.7108 Epoch 22/200 302/302 [==============================] - 0s 184us/sample - loss: 10.9464 - mae: 2.3477 - val_loss: 19.3497 - val_mae: 3.1024 Epoch 23/200 302/302 [==============================] - 0s 179us/sample - loss: 11.0638 - mae: 2.3306 - val_loss: 25.3786 - val_mae: 3.4427 Epoch 24/200 302/302 [==============================] - 0s 183us/sample - loss: 11.1271 - mae: 2.3356 - val_loss: 14.9296 - val_mae: 2.6805 100/1 [======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
==================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 61us/sample - loss: 11.8136 - mae: 2.6394 [CV] END learning_rate=0.003985152393924074, n_hidden=2, n_neurons=54; total time= 2.0s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 1s 2ms/sample - loss: 553.9404 - mae: 21.8333 - val_loss: 463.3083 - val_mae: 20.4506 Epoch 2/200 301/301 [==============================] - 0s 137us/sample - loss: 422.0405 - mae: 19.5595 - val_loss: 
332.5925 - val_mae: 17.5037
Epoch 3/200
301/301 - loss: 294.7219 - mae: 16.1290 - val_loss: 219.6608 - val_mae: 13.8550
Epoch 4/200
301/301 - loss: 191.0033 - mae: 12.5179 - val_loss: 141.6551 - val_mae: 10.7250
Epoch 5/200
301/301 - loss: 123.6560 - mae: 9.7901 - val_loss: 91.0738 - val_mae: 8.5217
[...]
Epoch 55/200
301/301 - loss: 11.0406 - mae: 2.4067 - val_loss: 15.1503 - val_mae: 2.9177
Epoch 56/200
301/301 - loss: 11.0741 - mae: 2.4342 - val_loss: 12.5107 - val_mae: 2.6023
101/1 - loss: 11.3147 - mae: 2.3718
[CV] END learning_rate=0.002550344321025771, n_hidden=2, n_neurons=7; total time= 2.9s
Train on 301 samples, validate on 135 samples
Epoch 1/200
301/301 - loss: 540.7278 - mae: 21.7576 - val_loss: 496.6364 - val_mae: 20.8942
Epoch 2/200
301/301 - loss: 461.4440 - mae: 20.0949 - val_loss: 412.0260 - val_mae: 18.8719
[...]
Epoch 53/200
301/301 - loss: 10.7213 - mae: 2.3507 - val_loss: 13.6284 - val_mae: 2.6821
Epoch 54/200
301/301 - loss: 10.8123 - mae: 2.3937 - val_loss: 13.5112 - val_mae: 2.6606
101/1 - loss: 8.7568 - mae: 2.4398
[CV] END learning_rate=0.002550344321025771, n_hidden=2, n_neurons=7; total time= 2.9s
Train on 302 samples, validate on 135 samples
Epoch 1/200
302/302 - loss: 597.6098 - mae: 21.9423 - val_loss: 549.9388 - val_mae: 21.3407
Epoch 2/200
302/302 - loss: 525.7384 - mae: 20.9801 - val_loss: 479.1754 - val_mae: 20.1862
[...]
Epoch 57/200
302/302 - loss: 11.4704 - mae: 2.4421 - val_loss: 13.4948 - val_mae: 2.6543
Epoch 58/200
302/302 - loss: 10.7862 - mae: 2.4057 - val_loss: 13.2519 - val_mae: 2.6094
100/1 - loss: 12.3233 - mae: 2.4834
[CV] END learning_rate=0.002550344321025771, n_hidden=2, n_neurons=7; total time= 3.3s
Train on 302 samples, validate on 135 samples
Epoch 1/200
302/302 - loss: 570.5621 - mae: 21.9812 - val_loss: 499.1355 - val_mae: 20.9296
Epoch 2/200
302/302 - loss: 461.2768 - mae: 20.2562 - val_loss: 396.3925 - val_mae: 18.9529
[...]
Epoch 51/200
302/302 - loss: 10.5962 - mae: 2.4414 - val_loss: 14.2999 - val_mae: 2.7461
Epoch 52/200
302/302 - loss: 9.7586 - mae: 2.3241 - val_loss: 14.5584 - val_mae: 2.7560
100/1 [...]
=====================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 44us/sample - loss: 11.8825 - mae: 2.6466 [CV] END learning_rate=0.002550344321025771, n_hidden=2, n_neurons=7; total time= 2.8s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 1s 4ms/sample - loss: 8805.6641 - mae: 57.1921 - val_loss: 556.8948 - 
val_mae: 17.7676 Epoch 2/200 301/301 [==============================] - 0s 391us/sample - loss: 292.6573 - mae: 12.4139 - val_loss: 85.6909 - val_mae: 6.5611 Epoch 3/200 301/301 [==============================] - 0s 388us/sample - loss: 74.6260 - mae: 6.3899 - val_loss: 53.3451 - val_mae: 5.0975 Epoch 4/200 301/301 [==============================] - 0s 384us/sample - loss: 34.2001 - mae: 4.0813 - val_loss: 37.1820 - val_mae: 3.9545 Epoch 5/200 301/301 [==============================] - 0s 395us/sample - loss: 26.7679 - mae: 3.4988 - val_loss: 36.5885 - val_mae: 4.1023 Epoch 6/200 301/301 [==============================] - 0s 392us/sample - loss: 30.0210 - mae: 3.7344 - val_loss: 52.2510 - val_mae: 4.6112 Epoch 7/200 301/301 [==============================] - 0s 383us/sample - loss: 43.3514 - mae: 4.6720 - val_loss: 86.5007 - val_mae: 7.5231 Epoch 8/200 301/301 [==============================] - 0s 371us/sample - loss: 65.5493 - mae: 6.1578 - val_loss: 47.5190 - val_mae: 5.1039 Epoch 9/200 301/301 [==============================] - 0s 375us/sample - loss: 53.2844 - mae: 5.0857 - val_loss: 45.8212 - val_mae: 4.3535 Epoch 10/200 301/301 [==============================] - 0s 375us/sample - loss: 36.1220 - mae: 4.2443 - val_loss: 23.4112 - val_mae: 3.0190 Epoch 11/200 301/301 [==============================] - 0s 372us/sample - loss: 23.0881 - mae: 3.2303 - val_loss: 42.7470 - val_mae: 4.1724 Epoch 12/200 301/301 [==============================] - 0s 375us/sample - loss: 24.6727 - mae: 3.2284 - val_loss: 21.7698 - val_mae: 2.9739 Epoch 13/200 301/301 [==============================] - 0s 373us/sample - loss: 27.2255 - mae: 3.3700 - val_loss: 26.3070 - val_mae: 3.7467 Epoch 14/200 301/301 [==============================] - 0s 372us/sample - loss: 40.3753 - mae: 4.5057 - val_loss: 49.9283 - val_mae: 5.2746 Epoch 15/200 301/301 [==============================] - 0s 371us/sample - loss: 26.4796 - mae: 3.5639 - val_loss: 22.4645 - val_mae: 3.0443 Epoch 16/200 301/301 
[==============================] - 0s 364us/sample - loss: 19.9816 - mae: 2.9074 - val_loss: 26.6777 - val_mae: 3.5543 Epoch 17/200 301/301 [==============================] - 0s 374us/sample - loss: 43.2132 - mae: 3.8929 - val_loss: 322.1850 - val_mae: 8.8913 Epoch 18/200 301/301 [==============================] - 0s 374us/sample - loss: 42.0614 - mae: 3.9878 - val_loss: 25.5229 - val_mae: 3.3445 Epoch 19/200 301/301 [==============================] - 0s 361us/sample - loss: 32.8628 - mae: 3.6419 - val_loss: 39.2395 - val_mae: 5.0021 Epoch 20/200 301/301 [==============================] - 0s 367us/sample - loss: 29.1957 - mae: 3.7895 - val_loss: 38.9073 - val_mae: 4.0658 Epoch 21/200 301/301 [==============================] - 0s 369us/sample - loss: 25.7689 - mae: 3.3961 - val_loss: 42.7749 - val_mae: 4.2579 Epoch 22/200 301/301 [==============================] - 0s 363us/sample - loss: 39.9792 - mae: 4.7172 - val_loss: 105.0326 - val_mae: 7.8540 101/1 [========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
==============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] 
- 0s 129us/sample - loss: 80.4126 - mae: 6.6022 [CV] END learning_rate=0.010681406083821229, n_hidden=5, n_neurons=89; total time= 3.6s Train on 301 samples, validate on 135 samples Epoch 1/200 301/301 [==============================] - 1s 4ms/sample - loss: 23884.3808 - mae: 92.6273 - val_loss: 408.3074 - val_mae: 15.3304 Epoch 2/200 301/301 [==============================] - 0s 371us/sample - loss: 436.7972 - mae: 14.7493 - val_loss: 106.1173 - val_mae: 7.1403 Epoch 3/200 301/301 [==============================] - 0s 375us/sample - loss: 53.1591 - mae: 5.0387 - val_loss: 76.4702 - val_mae: 6.5277 Epoch 4/200 301/301 [==============================] - 0s 374us/sample - loss: 46.5870 - mae: 4.9152 - val_loss: 34.2769 - val_mae: 4.3270 Epoch 5/200 301/301 [==============================] - 0s 374us/sample - loss: 30.6814 - mae: 3.8686 - val_loss: 50.5489 - val_mae: 5.8443 Epoch 6/200 301/301 [==============================] - 0s 368us/sample - loss: 41.7198 - mae: 4.7473 - val_loss: 80.7327 - val_mae: 7.7303 Epoch 7/200 301/301 [==============================] - 0s 371us/sample - loss: 65.8469 - mae: 6.2747 - val_loss: 51.5077 - val_mae: 5.2165 Epoch 8/200 301/301 [==============================] - 0s 371us/sample - loss: 95.7567 - mae: 7.4123 - val_loss: 52.0894 - val_mae: 5.2532 Epoch 9/200 301/301 [==============================] - 0s 372us/sample - loss: 34.4399 - mae: 4.1020 - val_loss: 42.2582 - val_mae: 4.3940 Epoch 10/200 301/301 [==============================] - 0s 372us/sample - loss: 30.9775 - mae: 3.9800 - val_loss: 48.2176 - val_mae: 5.1531 Epoch 11/200 301/301 [==============================] - 0s 372us/sample - loss: 249.4659 - mae: 9.8497 - val_loss: 238.2219 - val_mae: 11.9166 Epoch 12/200 301/301 [==============================] - 0s 370us/sample - loss: 72.9038 - mae: 5.7267 - val_loss: 36.2558 - val_mae: 4.5424 Epoch 13/200 301/301 [==============================] - 0s 361us/sample - loss: 40.7257 - mae: 4.4675 - val_loss: 23.7045 - val_mae: 
3.4436 Epoch 14/200 301/301 [==============================] - 0s 366us/sample - loss: 23.2895 - mae: 3.1566 - val_loss: 23.3845 - val_mae: 3.1599 Epoch 15/200 301/301 [==============================] - 0s 368us/sample - loss: 21.9917 - mae: 3.0713 - val_loss: 29.4938 - val_mae: 3.6797 Epoch 16/200 301/301 [==============================] - 0s 370us/sample - loss: 31.0404 - mae: 3.6523 - val_loss: 27.2828 - val_mae: 3.3000 Epoch 17/200 301/301 [==============================] - 0s 374us/sample - loss: 32.1532 - mae: 3.7500 - val_loss: 26.4909 - val_mae: 3.3207 Epoch 18/200 301/301 [==============================] - 0s 365us/sample - loss: 34.0368 - mae: 3.9655 - val_loss: 35.5335 - val_mae: 3.4752 Epoch 19/200 301/301 [==============================] - 0s 369us/sample - loss: 27.4102 - mae: 3.5300 - val_loss: 46.6693 - val_mae: 4.4868 Epoch 20/200 301/301 [==============================] - 0s 360us/sample - loss: 27.4142 - mae: 3.4918 - val_loss: 32.5143 - val_mae: 3.6088 Epoch 21/200 301/301 [==============================] - 0s 370us/sample - loss: 29.9894 - mae: 3.5143 - val_loss: 32.1760 - val_mae: 3.6830 Epoch 22/200 301/301 [==============================] - 0s 361us/sample - loss: 23.0128 - mae: 3.0983 - val_loss: 23.1988 - val_mae: 3.2387 Epoch 23/200 301/301 [==============================] - 0s 367us/sample - loss: 24.6354 - mae: 3.1769 - val_loss: 42.1304 - val_mae: 4.2954 Epoch 24/200 301/301 [==============================] - 0s 367us/sample - loss: 33.5964 - mae: 4.0012 - val_loss: 26.3647 - val_mae: 3.5510 Epoch 25/200 301/301 [==============================] - 0s 363us/sample - loss: 52.2316 - mae: 4.7052 - val_loss: 97.4677 - val_mae: 8.2147 Epoch 26/200 301/301 [==============================] - 0s 371us/sample - loss: 251.9650 - mae: 6.9048 - val_loss: 22.7825 - val_mae: 3.1696 Epoch 27/200 301/301 [==============================] - 0s 359us/sample - loss: 36.6087 - mae: 4.1661 - val_loss: 36.7896 - val_mae: 4.8234 Epoch 28/200 301/301 
[==============================] - 0s 371us/sample - loss: 696.4612 - mae: 5.6811 - val_loss: 83.9895 - val_mae: 6.5422 Epoch 29/200 301/301 [==============================] - 0s 369us/sample - loss: 36.6065 - mae: 4.1172 - val_loss: 21.4796 - val_mae: 2.9223 Epoch 30/200 301/301 [==============================] - 0s 355us/sample - loss: 38.5157 - mae: 4.0236 - val_loss: 46.1765 - val_mae: 5.6191 Epoch 31/200 301/301 [==============================] - 0s 368us/sample - loss: 2394.8926 - mae: 22.4371 - val_loss: 84.2110 - val_mae: 7.2118 Epoch 32/200 301/301 [==============================] - 0s 362us/sample - loss: 65.0746 - mae: 5.9396 - val_loss: 79.2887 - val_mae: 7.4279 Epoch 33/200 301/301 [==============================] - 0s 368us/sample - loss: 41.0852 - mae: 4.5894 - val_loss: 21.4788 - val_mae: 2.7987 Epoch 34/200 301/301 [==============================] - 0s 365us/sample - loss: 22.3072 - mae: 2.9919 - val_loss: 38.9868 - val_mae: 5.0881 Epoch 35/200 301/301 [==============================] - 0s 365us/sample - loss: 30.5782 - mae: 4.0519 - val_loss: 21.8967 - val_mae: 3.1709 Epoch 36/200 301/301 [==============================] - 0s 367us/sample - loss: 25.0508 - mae: 3.3946 - val_loss: 22.7785 - val_mae: 3.0935 Epoch 37/200 301/301 [==============================] - 0s 365us/sample - loss: 27.5810 - mae: 3.5953 - val_loss: 29.9011 - val_mae: 3.4330 Epoch 38/200 301/301 [==============================] - 0s 361us/sample - loss: 36.2268 - mae: 4.1077 - val_loss: 192.7644 - val_mae: 12.6476 Epoch 39/200 301/301 [==============================] - 0s 367us/sample - loss: 52.7511 - mae: 5.4153 - val_loss: 34.6121 - val_mae: 4.2887 Epoch 40/200 301/301 [==============================] - 0s 366us/sample - loss: 20.9878 - mae: 2.9003 - val_loss: 22.6331 - val_mae: 3.4066 Epoch 41/200 301/301 [==============================] - 0s 358us/sample - loss: 85.4160 - mae: 6.5361 - val_loss: 81.3484 - val_mae: 7.2222 Epoch 42/200 301/301 [==============================] 
- 0s 365us/sample - loss: 10825.5143 - mae: 13.0240 - val_loss: 68.4008 - val_mae: 5.4180 Epoch 43/200 301/301 [==============================] - 0s 364us/sample - loss: 36.7605 - mae: 4.0952 - val_loss: 169.7339 - val_mae: 4.4454 101/1 [==================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
====================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 125us/sample - loss: 30.8983 - mae: 3.6201 [CV] END learning_rate=0.010681406083821229, n_hidden=5, n_neurons=89; total time= 6.1s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 4ms/sample - loss: 2417.9133 - mae: 32.2527 - val_loss: 425.5278 - val_mae: 17.0605 Epoch 2/200 302/302 [==============================] - 0s 369us/sample - loss: 184.8877 - mae: 10.4116 - val_loss: 126.0588 - val_mae: 9.5432 Epoch 3/200 302/302 [==============================] - 0s 371us/sample - loss: 79.9769 - mae: 6.5545 - val_loss: 77.4787 - val_mae: 6.1905 Epoch 4/200 302/302 [==============================] - 0s 372us/sample - loss: 52.8168 - mae: 5.0994 - val_loss: 59.2185 - 
val_mae: 5.4422 Epoch 5/200 302/302 [==============================] - 0s 372us/sample - loss: 40.8601 - mae: 4.6746 - val_loss: 55.9858 - val_mae: 5.7886 Epoch 6/200 302/302 [==============================] - 0s 375us/sample - loss: 58.3077 - mae: 5.0694 - val_loss: 56.7525 - val_mae: 5.1386 Epoch 7/200 302/302 [==============================] - 0s 384us/sample - loss: 67.8318 - mae: 6.0765 - val_loss: 24.2311 - val_mae: 3.3508 Epoch 8/200 302/302 [==============================] - 0s 373us/sample - loss: 28.8745 - mae: 3.5893 - val_loss: 35.5997 - val_mae: 3.8429 Epoch 9/200 302/302 [==============================] - 0s 364us/sample - loss: 34.8519 - mae: 4.0544 - val_loss: 97.3364 - val_mae: 7.1553 Epoch 10/200 302/302 [==============================] - 0s 378us/sample - loss: 58.4877 - mae: 5.2015 - val_loss: 25.0272 - val_mae: 3.0762 Epoch 11/200 302/302 [==============================] - 0s 374us/sample - loss: 23.7640 - mae: 3.2064 - val_loss: 24.3574 - val_mae: 3.5932 Epoch 12/200 302/302 [==============================] - 0s 369us/sample - loss: 25.5511 - mae: 3.2396 - val_loss: 36.4423 - val_mae: 3.9094 Epoch 13/200 302/302 [==============================] - 0s 363us/sample - loss: 32.4703 - mae: 3.7923 - val_loss: 36.2871 - val_mae: 4.6650 Epoch 14/200 302/302 [==============================] - 0s 365us/sample - loss: 38.7605 - mae: 4.4948 - val_loss: 27.7580 - val_mae: 3.4007 Epoch 15/200 302/302 [==============================] - 0s 362us/sample - loss: 25.4758 - mae: 3.4408 - val_loss: 27.0030 - val_mae: 3.2135 Epoch 16/200 302/302 [==============================] - 0s 361us/sample - loss: 19.7754 - mae: 2.8701 - val_loss: 27.2531 - val_mae: 3.3798 Epoch 17/200 302/302 [==============================] - 0s 366us/sample - loss: 19.3969 - mae: 2.8848 - val_loss: 54.3727 - val_mae: 5.2705 100/1 
[===============================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=========================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 0s 130us/sample - loss: 60.2862 - mae: 5.3911 [CV] END learning_rate=0.010681406083821229, n_hidden=5, n_neurons=89; total time= 3.0s Train on 302 samples, validate on 135 samples Epoch 1/200 302/302 [==============================] - 1s 4ms/sample - loss: 40195.6962 - mae: 102.4402 - val_loss: 1432.1276 - val_mae: 32.1059 Epoch 2/200 302/302 [==============================] - 0s 381us/sample - loss: 418.5850 - mae: 15.7484 - val_loss: 133.3250 - val_mae: 8.5107 Epoch 3/200 302/302 [==============================] - 0s 396us/sample - loss: 81.9414 - mae: 6.6630 - val_loss: 111.7893 - val_mae: 8.5425 Epoch 4/200 302/302 [==============================] - 0s 388us/sample - loss: 61.6473 - mae: 6.1165 - val_loss: 46.6716 - val_mae: 5.0746 Epoch 5/200 302/302 [==============================] - 0s 384us/sample - loss: 39.3759 - mae: 4.5569 - val_loss: 49.2511 - val_mae: 4.8976 Epoch 6/200 302/302 [==============================] - 0s 379us/sample - loss: 58.2469 - mae: 5.6797 - 
val_loss: 92.4582 - val_mae: 7.1447 Epoch 7/200 302/302 [==============================] - 0s 386us/sample - loss: 109.6476 - mae: 7.5530 - val_loss: 64.2166 - val_mae: 5.9808 Epoch 8/200 302/302 [==============================] - 0s 386us/sample - loss: 49.2041 - mae: 5.1997 - val_loss: 31.9639 - val_mae: 3.7627 Epoch 9/200 302/302 [==============================] - 0s 389us/sample - loss: 35.7164 - mae: 4.0791 - val_loss: 42.6363 - val_mae: 4.4603 Epoch 10/200 302/302 [==============================] - 0s 390us/sample - loss: 37.9956 - mae: 4.1706 - val_loss: 40.0487 - val_mae: 4.2172 Epoch 11/200 302/302 [==============================] - 0s 390us/sample - loss: 34.5212 - mae: 3.8739 - val_loss: 43.3696 - val_mae: 4.6619 Epoch 12/200 302/302 [==============================] - 0s 376us/sample - loss: 57.2109 - mae: 5.1603 - val_loss: 61.4577 - val_mae: 5.5187 Epoch 13/200 302/302 [==============================] - 0s 377us/sample - loss: 55.4313 - mae: 5.1870 - val_loss: 77.9793 - val_mae: 6.3099 Epoch 14/200 302/302 [==============================] - 0s 366us/sample - loss: 33.8017 - mae: 3.9212 - val_loss: 53.7677 - val_mae: 4.8034 Epoch 15/200 302/302 [==============================] - 0s 377us/sample - loss: 29.2482 - mae: 3.6499 - val_loss: 40.2464 - val_mae: 4.2100 Epoch 16/200 302/302 [==============================] - 0s 385us/sample - loss: 28.5694 - mae: 3.5083 - val_loss: 25.7512 - val_mae: 3.2365 Epoch 17/200 302/302 [==============================] - 0s 383us/sample - loss: 42.0901 - mae: 4.3290 - val_loss: 41.2022 - val_mae: 5.0034 Epoch 18/200 302/302 [==============================] - 0s 381us/sample - loss: 38.2788 - mae: 4.5549 - val_loss: 49.7326 - val_mae: 4.8939 Epoch 19/200 302/302 [==============================] - 0s 386us/sample - loss: 28.6342 - mae: 3.5741 - val_loss: 34.3443 - val_mae: 4.3361 Epoch 20/200 302/302 [==============================] - 0s 379us/sample - loss: 22.9229 - mae: 3.1679 - val_loss: 35.8828 - val_mae: 3.9280 Epoch 
Epoch 21/200 302/302 [==============================] - 0s 405us/sample - loss: 18.8627 - mae: 2.7809 - val_loss: 43.9077 - val_mae: 4.3201
Epoch 22/200 302/302 [==============================] - 0s 389us/sample - loss: 27.1390 - mae: 3.5346 - val_loss: 46.3250 - val_mae: 5.4332
Epoch 23/200 302/302 [==============================] - 0s 366us/sample - loss: 30.2050 - mae: 4.0224 - val_loss: 35.3301 - val_mae: 4.7349
Epoch 24/200 302/302 [==============================] - 0s 364us/sample - loss: 169.9130 - mae: 6.6170 - val_loss: 1609.0817 - val_mae: 16.1756
Epoch 25/200 302/302 [==============================] - 0s 362us/sample - loss: 379.2927 - mae: 6.8494 - val_loss: 33.8978 - val_mae: 3.6797
Epoch 26/200 302/302 [==============================] - 0s 365us/sample - loss: 27.4120 - mae: 3.3760 - val_loss: 31.0179 - val_mae: 3.8263
100/1 [==============================] - 0s 124us/sample - loss: 29.3708 - mae: 4.1337
[CV] END learning_rate=0.010681406083821229, n_hidden=5, n_neurons=89; total time= 4.1s
Train on 402 samples, validate on 135 samples
Epoch 1/200 402/402 [==============================] - 1s 2ms/sample - loss: 511.0362 - mae: 21.3255 - val_loss: 406.3507 - val_mae: 19.2863
Epoch 2/200 402/402 [==============================] - 0s 108us/sample - loss: 350.7660 - mae: 17.6256 - val_loss: 260.5049 - val_mae: 14.9219
...
Epoch 62/200 402/402 [==============================] - 0s 106us/sample - loss: 11.1371 - mae: 2.4362 - val_loss: 14.0883 - val_mae: 2.6705
Epoch 63/200 402/402 [==============================] - 0s 105us/sample - loss: 10.7710 - mae: 2.3869 - val_loss: 13.4727 - val_mae: 2.5892
RandomizedSearchCV(cv=4,
estimator=<tensorflow.python.keras.wrappers.scikit_learn.KerasRegressor object at 0x7fc7026a1790>,
param_distributions={'learning_rate': [0.0016533386369064133,
0.0031824466021970047,
0.0018845915673734637,
0.0036089914604343796,
0.020919172116534344,
0.0126486372267292,
0.005359930448345483,
0.02226733989414441,
0.002321605314271074,
0.0...
0.005708519661026795,
0.017399804582171642,
0.005434453028180617,
0.002965554626272023,
0.01162804847051205,
0.007034590931032023,
0.0017174123739803564,
0.010241571463592046,
0.008713230592980898,
0.011185005559772096,
0.0025147830750944554, ...],
'n_hidden': [1, 2, 3, 5],
'n_neurons': [1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 14, 15,
16, 17, 18, 19, 20, 21,
22, 23, 24, 25, 26, 27,
28, 29, 30, ...]},
verbose=2)
# Best params values
rnd_search_cv.best_params_
{'n_neurons': 7, 'n_hidden': 2, 'learning_rate': 0.002550344321025771}
# Best MSE score. Note that it is negative: scikit-learn maximises the score when
# optimising, so the negated MSE (-MSE) is used as the score, making larger values better.
rnd_search_cv.best_score_
-11.739369096331076
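The best configuration can be rebuilt with a model-builder function of the kind `KerasRegressor` wraps. The sketch below is an assumption about that builder: the name `build_model`, the ReLU activations, the Adam optimizer, and the 9-feature input shape are illustrative; only `n_hidden`, `n_neurons`, and `learning_rate` (with the best values found above as defaults) come from the search.

```python
from tensorflow import keras

# Hypothetical builder of the kind passed to KerasRegressor(build_fn=build_model).
# Defaults mirror rnd_search_cv.best_params_; activations, optimizer, and
# input_shape are assumptions, not taken from the notebook.
def build_model(n_hidden=2, n_neurons=7, learning_rate=0.00255, input_shape=(9,)):
    model = keras.models.Sequential()
    model.add(keras.Input(shape=input_shape))
    for _ in range(n_hidden):                 # n_hidden identical dense layers
        model.add(keras.layers.Dense(n_neurons, activation="relu"))
    model.add(keras.layers.Dense(1))          # single output: heating load Y
    model.compile(loss="mse",
                  optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
                  metrics=["mae"])
    return model
```

Wrapping this in `KerasRegressor(build_fn=build_model)` is what exposes the three hyperparameters to `RandomizedSearchCV` via `param_distributions`, as in the repr above.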
results
| | Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV |
|---|---|---|---|---|---|---|---|
| 0 | Linear Regression | 3.628309 | 22.525745 | 4.746129 | 0.776707 | 4.655392 | 3.599192 |
| 1 | Ridge Regression | 3.624469 | 22.512906 | 4.744777 | 0.776834 | 4.655186 | 3.598106 |
| 2 | Lasso Regression | 3.581572 | 22.351453 | 4.727732 | 0.778435 | 4.645083 | 3.577086 |
| 4 | ElasticNet Regression | 3.583389 | 22.361371 | 4.728781 | 0.778337 | 4.645421 | 3.578348 |
model_neural_networks = rnd_search_cv.best_estimator_.model
model_neural_networks
<tensorflow.python.keras.engine.sequential.Sequential at 0x7fc6da94cdd0>
model_neural_networks.evaluate(X_test, y_test)
231/1 [==============================] - 0s 44us/sample - loss: 10.7535 - mae: 2.3376
[11.17687193139807, 2.3376129]
model = Lasso(alpha=0.12)
model.fit(X_train,y_train)
pred = model.predict(X_test)
results_lasso = pd.DataFrame(data=[['Lasso Regression', *evaluation(y_test, pred), rmse_cross_validation(model),mae_cross_validation(model)]],
columns=['Model', 'MAE', 'MSE', 'RMSE', 'R2 Square', 'RMSE_CV','MAE_CV'])
results_lasso
| | Model | MAE | MSE | RMSE | R2 Square | RMSE_CV | MAE_CV |
|---|---|---|---|---|---|---|---|
| 0 | Lasso Regression | 3.132207 | 17.826285 | 4.222119 | 0.823998 | 4.645083 | 3.577086 |
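The `alpha=0.12` above looks hand-picked; `LassoCV` (imported at the top of the notebook) can choose it by cross-validation instead. A minimal sketch on synthetic stand-in data (the array names, grid of alphas, and data shapes are illustrative, not the notebook's):

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Synthetic stand-in data: 9 features, as in the dataset, but randomly generated
rng = np.random.RandomState(0)
X = rng.randn(200, 9)
y = X @ rng.randn(9) + 0.1 * rng.randn(200)

# LassoCV fits the full regularisation path over a grid of alphas with k-fold CV
# and refits the model at the alpha with the best average validation score
lasso_cv = LassoCV(alphas=np.logspace(-3, 1, 50), cv=5, random_state=0).fit(X, y)
print(lasso_cv.alpha_)  # CV-selected regularisation strength
```

The selected `lasso_cv.alpha_` could then replace the fixed `alpha=0.12` in the cell above.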
Compared with Lasso, the neural network fits the data substantially better, with markedly lower MSE and MAE on the test set. A neural network acts as a general-purpose function approximator: its many neurons and connections let it capture non-linear interactions between the building attributes that a linear model cannot, so model complexity lives in the connections rather than in hand-crafted features. Lasso also has known limitations that may have affected its performance here. When there are more predictors than observations, it selects at most n variables before it saturates, and it cannot perform grouped selection: when a group of variables has very high pairwise correlations, Lasso tends to keep just one variable from the group, chosen essentially arbitrarily, and shrinks the others to zero.
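The grouped-selection limitation is easy to demonstrate. In this sketch (synthetic data, not the notebook's), `x2` is a noisy copy of `x1` that carries no extra information about `y`; Lasso keeps one of the correlated pair and shrinks the other to (near) zero, where ordinary least squares would split the weight between them:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Two highly correlated predictors: x2 is x1 plus small noise (corr ~ 0.995)
rng = np.random.RandomState(42)
x1 = rng.randn(300)
x2 = x1 + 0.1 * rng.randn(300)
X = np.column_stack([x1, x2])
y = 6 * x1 + 0.1 * rng.randn(300)   # y depends only on the shared signal

# Lasso concentrates nearly all the weight on one member of the pair
w = Lasso(alpha=0.5, max_iter=10_000).fit(X, y).coef_
print(w)
```

Which member of the pair survives depends on tiny differences in the sample correlations, which is why the choice is described as essentially arbitrary.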